Putting The RT In Art

The battle for dominance between archrivals Nvidia and AMD should come as no surprise to those who keep tabs on developments in graphics hardware. Both companies enjoy decades-long legacies of making powerful graphics cards (or as they are called nowadays, graphics processing units, or GPUs) for a myriad of uses. In a way, their relationship has almost been symbiotic, with rivalry propelling growth through each generation.

AMD entered the semiconductor industry as a developer of CPUs decades ago. It was with the acquisition of fellow semiconductor company ATI Technologies in 2006 that AMD entered the GPU market. ATI had already been a major player in the GPU arena by that time, trading blows with Nvidia, which had by then acquired once-rival 3dfx Interactive. AMD's acquisition of ATI did nothing to change that rivalry.

However, taking a closer look at both companies shows how different they are at their core.

Why is Nvidia’s RTX series such a game changer?
AMD is still building outstanding CPUs and GPUs for regular consumers and businesses, but its GPUs remain primarily gaming-centric, barring the Radeon Pro range of workstation cards. Nvidia, on the other hand, does not make CPUs and seems to be thinking along more esoteric yet futuristic lines. For instance, alongside its regular range of gaming (GeForce) and workstation (Quadro) GPUs, Nvidia also produces a special kind of GPU-like processor called an 'inference accelerator', which data scientists and researchers use to run high-speed scientific simulations and train artificial intelligence programs through a process called 'deep learning'.

Towards the end of August 2018, everything changed. Nvidia proved just how staggeringly distinct the two companies are by revealing something at the Gamescom 2018 conference that was previously considered impossible, or at least something of a utopian holy grail: real-time ray-tracing (RT) in a consumer-grade graphics card. To uninitiated ears, that may sound like yet another fancy throwaway feature, but here is why it is so unique.

To understand the idea behind ray-tracing, one needs to know a bit about light. Anyone with a rudimentary understanding of physics remembers that light is composed of rays of photons, which are emitted from a luminous object (e.g. the filament of a lightbulb). These rays strike objects in their path, which absorb some of the light and reflect the rest, and the reflected rays in turn strike other objects in their path. This process continues until the rays are sufficiently diffused. For example, if a flashlight is shined on a tabletop, part of its light is reflected onto the surroundings, and from there onto other nearby objects, until it travels too far to make a visible difference.

In order to simulate something using 3D graphics, this reflective behavior of light (as well as the shapes and textures of the objects it comes in contact with) needs to be simulated using a computer to make the 3D objects accurately visible. This process of simulating the paths of the light rays is called ray-tracing. However, this is an immensely resource-intensive process, and rendering a single frame of a 3D scene can take hours even on a professional render farm (an array of computers working together to accomplish particularly complex tasks).
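The core loop described above, casting a ray, finding the nearest surface it hits, and shading that point according to the light reaching it, can be sketched in a few lines. The scene below (a single sphere and a point light) and all of its values are illustrative assumptions, not anything from an actual renderer:

```python
# Minimal illustration of ray-tracing's core step: cast a ray, find the
# nearest surface it hits, and shade that point based on the light
# reaching it. A full tracer repeats this for millions of rays per frame.
import math

def intersect_sphere(origin, direction, center, radius):
    """Return the distance t along the ray to the sphere, or None if missed."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2 * sum(d * e for d, e in zip(direction, oc))
    c = sum(e * e for e in oc) - radius * radius
    disc = b * b - 4 * c  # direction is unit-length, so a == 1
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 0 else None

def shade(point, normal, light_pos):
    """Lambertian (diffuse) shading: brightness ~ cosine of the light angle."""
    to_light = [l - p for l, p in zip(light_pos, point)]
    length = math.sqrt(sum(v * v for v in to_light))
    to_light = [v / length for v in to_light]
    return max(0.0, sum(n * l for n, l in zip(normal, to_light)))

# Camera at the origin shooting one ray down the z-axis at a unit sphere
# centred at (0, 0, 5), lit by a point light off to the side.
origin, direction = (0, 0, 0), (0, 0, 1)
center, radius, light = (0, 0, 5), 1.0, (0, 10, 0)

t = intersect_sphere(origin, direction, center, radius)
if t is not None:
    hit = [o + t * d for o, d in zip(origin, direction)]
    normal = [(h - c) / radius for h, c in zip(hit, center)]
    print(round(shade(hit, normal, light), 3))  # brightness of that one pixel
```

Multiply this by every pixel on screen, several bounces per ray, and dozens of frames per second, and the computational burden the next paragraphs describe becomes apparent.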

3D graphics are everywhere nowadays, all the way from films to video games. But the gap in graphics quality between movies and video games has always been a noticeable one. Even with game graphics getting steadily better as game engines gain new features and optimizations, they still lack physical accuracy, relying a great deal on pre-calculations and optical illusions instead of actual ray-tracing to approximate realism. To make a game properly playable, the viewer needs to see at least 30 frames of action per second in real time, which could not be achieved through ray-tracing. Until now.
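To put that requirement in numbers: 30 frames per second leaves roughly 33 milliseconds to render each frame, while offline ray-tracing can take hours per frame. A quick back-of-envelope calculation (the two-hour figure is an illustrative assumption, not a benchmark) shows the scale of the gap:

```python
# Back-of-envelope: the gulf between offline and real-time ray-tracing.
# The offline figure is an illustrative assumption, not a measurement.
offline_seconds_per_frame = 2 * 3600   # "hours per frame" on a render farm
realtime_budget_seconds = 1 / 30       # 30 fps => ~33 ms per frame
speedup_needed = offline_seconds_per_frame / realtime_budget_seconds
print(f"{realtime_budget_seconds * 1000:.1f} ms budget, "
      f"~{speedup_needed:,.0f}x speedup required")
# -> 33.3 ms budget, ~216,000x speedup required
```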

Codenamed ‘Turing’, Nvidia’s new RTX 20 series of GPUs is capable of performing true ray-tracing in real time at up to a mind-blowing 60 frames per second, rendering scenes that previously took hours per frame. This is still first-generation technology, and it can only get better with upcoming iterations and with increasing support from game developers taking advantage of it to max out the potential of their engines. The dream of photorealistic real-time graphics has never been closer to fruition than it is now.

Of course, to a hardcore gamer who expects over a hundred frames per second in a game, that probably sounds paltry, and it caused an expected uproar in the gaming community. The high prices of the RTX cards are also a matter of concern for many gamers, starting at around $600 (for the RTX 2070) and peaking at $1,200 (for the RTX 2080 Ti), which is almost prohibitively high, especially considering that the regular (non-ray-traced) gaming performance of the cards, while still very high, is lower than expected. Nonetheless, what these rants ignore is that these ‘better shadows and reflections’ spell the difference between photorealism and a potential uncanny valley. Pulling this off at 60 FPS is a feat that was previously undreamt of, which goes a long way toward justifying the premium pricing.

The paradigm shift caused by the arrival of real-time ray-tracing is nothing short of seismic, as its implications extend far beyond the realm of gaming. Virtual reality can now look more real than ever. But the RTX cards may make the greatest difference in the field of design and media production. Users would be able to build production-grade 3D renders, view them, and make changes on the fly, with no waiting and more realistic results than ever, easily saving millions of production hours and completely revolutionizing the field. Animation giant Pixar is already in talks with Nvidia about using RTX GPUs to render their upcoming productions in a fraction of the time previously required, and many other movie and video game studios can be expected to follow suit.

The demo videos of RTX capabilities shown by Nvidia during Gamescom floored viewers around the world. While the GPUs are not expected to go on sale before September, it is safe to say that the hype behind them is only too real, and entirely justified. There has never been a better time to be a 3D artist or a gamer than now, at the dawn of the real-time ray-tracing age.
