At its annual keynote at GDC 2019, Unity Technologies debuted a swath of improvements to its popular Unity game engine and development tools. This was business as usual for the company, whose presence at the show is a given year after year, especially as the pre-eminent tool for mobile game and independent video game development. However, while speaker after speaker presented a host of changes and improvements aimed at helping developers get better results on phones and tablets, Unity appeared to have another goal in mind.
Epic Games’ Unreal Engine 4 has become the leading platform for current-generation console and PC development, though it hasn’t found the same level of market penetration in mobile. Unity Technologies’ suite has, fairly or not, been seen as a more entry-level tool, or an engine for smaller, less graphically and systemically sophisticated games. Much of Unity’s presentation at this year’s GDC was oriented around marquee graphical features that appear poised to put the engine at parity with Epic’s Unreal Engine, seeking to shift the existing narrative around Unity as developers and publishers contend with what Unity CEO John Riccitiello referred to as “unannounced platforms.” Riccitiello specified Google’s cloud gaming initiative, set to debut this week at GDC, but it seems all but assured that he was also alluding to next-generation platforms from Sony and Microsoft.
This seemed clearest as various Unity presenters elaborated on what the company calls its High Definition Render Pipeline, or HDRP. The pipeline supports high-end features including physically based rendering, which lets developers work with in-game materials that mimic the qualities and appearance of real-world surfaces by simulating, among other things, how light behaves when it strikes them. Presenters also emphasized Unity’s enhanced support for subsurface scattering, a technique that makes organic materials such as skin look more realistic by modeling light that penetrates and diffuses beneath a surface.
However, these are industry-standard features in 2019, particularly in AAA development, where games have supported these techniques and technologies for several years. Similarly, Unity announced a partnership with physics middleware company Havok to greatly improve both Unity’s built-in physics and its integration with Havok’s more specialized tools, which is once again fairly pedestrian in console and PC game development (and whose support on mobile platforms is limited by the power constraints of those devices).
Acknowledging this, however, Unity dedicated an extended stretch of its keynote to what is likely to be the most common development buzzword coming out of this year’s GDC: real-time raytracing. Raytracing is a technique in which light, color, shadows, and reflections are calculated by casting virtual “rays” that “bounce” around the scene in a manner similar to the way light behaves in the real world, with potentially dramatic increases in fidelity and realism. Until recently, raytracing was seen as impossible to achieve in real time in video game engines, but work done by companies including Microsoft, graphics card developers Nvidia and AMD, and numerous others has put the tech within reach.
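For readers curious what “casting rays” means in practice, the heart of a ray tracer can be sketched in a few lines. The snippet below is a generic, illustrative Python sketch (not Unity’s HDRP implementation, which runs on the GPU): it casts a single ray into a scene containing one sphere and, on a hit, computes a simple brightness from the surface’s angle to a light. Real engines extend this by recursively bouncing rays to capture reflections, shadows, and indirect light.

```python
import math

# Illustrative ray tracer core: one sphere, one light, primary rays only.
# A hypothetical sketch of the general technique, not any engine's actual code.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def normalize(v):
    n = math.sqrt(dot(v, v))
    return tuple(x / n for x in v)

def hit_sphere(origin, direction, center, radius):
    """Return the distance along the ray to the nearest sphere hit, or None.

    Assumes `direction` is unit length, so the quadratic's leading
    coefficient is 1.
    """
    oc = sub(origin, center)
    b = 2.0 * dot(oc, direction)
    c = dot(oc, oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

def shade(origin, direction, center, radius, light_dir):
    """Cast one ray; on a hit, return Lambertian brightness in [0, 1]."""
    t = hit_sphere(origin, direction, center, radius)
    if t is None:
        return 0.0  # ray escaped into the background
    hit = tuple(o + t * d for o, d in zip(origin, direction))
    normal = normalize(sub(hit, center))  # surface normal at the hit point
    return max(0.0, dot(normal, normalize(light_dir)))
```

A ray fired from the origin straight at a sphere lit head-on (`shade((0, 0, 0), (0, 0, -1), (0, 0, -3), 1.0, (0, 0, 1))`) returns full brightness, while a ray that misses returns the background value; hardware-accelerated raytracing performs millions of such intersection and shading queries per frame.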
Unity’s real-time raytracing solution was impressive, if a bit fuzzy on the details of the hardware running it, but it signaled that Unity Technologies is no longer satisfied supporting cutting-edge video game development trends and tech years after the fact.
Much of Unity’s keynote this year seemed oriented around this idea. Two of the presentation’s main moments revolved around ambitious projects outside the generally accepted idea of a Unity project — Warren Spector and OtherSide Entertainment’s “System Shock 3” and Lorne Lanning and Oddworld Inhabitants’ “Oddworld Soulstorm.” These are projects with high expectations around them, and not just from a gameplay perspective. The demonstrations for both titles emphasized high-end production values that didn’t look like compromises made in light of Unity’s ease of use. They looked exciting and legitimately contemporary in a way Unity-powered titles are not typically expected to, outside of side-scrolling standouts like “Inside,” “Ori and the Blind Forest,” and “Cuphead.”
The message seemed to be a hybrid of Unity’s previously understood place in the game development pecking order and loftier ambitions. Unity reps returned again and again to the ways in which the engine and its accompanying tools and middleware could be used to sustain a game as a service and support a variety of business models, while preserving affordability and ease of use, including for distributed teams, all while achieving results that didn’t seem like the aforementioned compromise. Even Unity’s pitch on raytracing stressed the cost benefits of blending raytraced visuals with more traditional approaches to save on asset creation, currently one of the most expensive resource drains in modern game development.
To cap off Unity’s pitch to game developers, many of whom are currently investigating and planning for next-generation development projects, it debuted its latest engine demo: “The Heretic.” A fully real-time graphical showcase of Unity’s most cutting-edge tools, it’s an impressive example of what can be achieved, and, potentially, what gamers may be able to get their hands on in the years ahead.