ArmA 3 Clan
UK Based
Ray Tracing, What's Around the Corner?

[ABA]UNDIES
British Army Team

Posts : 582
Join date : 2007-12-17
Age : 50
Location : England's green and pleasant lands

Post subject: Ray Tracing, What's Around the Corner?   Sat 14 Jun 2008, 05:47

For those who have never heard of it, ray tracing is the new thing that GFX and game designers are heading towards. The technique itself has been around for years, but it didn't get far in games because of the amount of processing power needed to render even simple scenes in real time. It gives almost photorealistic images.
[Image: ray-traced glasses scene]
Quote :
Ray tracing is a technique for rendering three-dimensional graphics with very complex light interactions. This means you can create pictures full of mirrors, transparent surfaces, and shadows, with stunning results.
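To make the quoted definition concrete, here's a minimal sketch of the core idea in Python (entirely my own illustration, not from any of the articles quoted in this thread): fire a ray from the eye through each pixel, test it against the scene (here, a single sphere), and shade the hit point from a light direction. Real renderers do exactly this, just with millions of rays, full scene geometry, and extra rays for the shadows, reflections and refractions mentioned above.

```python
import math

def normalize(v):
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

def ray_sphere(origin, direction, center, radius):
    """Return distance t to the nearest hit along a unit-length ray, or None on a miss."""
    # Solve |origin + t*direction - center|^2 = radius^2 for t (a quadratic in t).
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c          # direction is unit length, so a == 1
    if disc < 0:
        return None                 # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

def render(width, height):
    """Trace one primary ray per pixel at a single sphere; return rows of brightness 0-9."""
    sphere_center, sphere_radius = [0.0, 0.0, -3.0], 1.0
    light_dir = normalize([1.0, 1.0, 1.0])   # direction towards the light
    image = []
    for y in range(height):
        row = []
        for x in range(width):
            # Map the pixel to a point on an image plane at z = -1.
            u = 2.0 * (x + 0.5) / width - 1.0
            v = 1.0 - 2.0 * (y + 0.5) / height
            ray = normalize([u, v, -1.0])
            t = ray_sphere([0.0, 0.0, 0.0], ray, sphere_center, sphere_radius)
            if t is None:
                row.append(0)                # background
            else:
                hit = [t * r for r in ray]
                normal = normalize([h - c for h, c in zip(hit, sphere_center)])
                diffuse = max(0.0, sum(n * l for n, l in zip(normal, light_dir)))
                row.append(int(diffuse * 9)) # simple diffuse shading
        image.append(row)
    return image
```

The expensive part is that the intersection test runs for every ray against every object (and again for every shadow or reflection ray), which is why the articles below keep coming back to raw processing power.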

All the big firms denied any interest in ray tracing until Intel piped up about its forthcoming Larrabee GPU, which will be its first real foray into the 3D GFX market for some time. As we all know, Intel's graphics chips are generally a bag of balls, relegated to laptops and media centres.
Quote :
Larrabee has been the subject of much debate since Intel’s CEO, Paul Otellini, casually said that it would move the company 'into discrete graphics' at the Intel Developer Forum last year. Until now, all we knew was that it was going to feature multiple x86 cores, and many assumed that it was going to push ray tracing into the mainstream. However, Intel has now finally given us some more details on the mysterious new chip.

Featuring many IA++ (Intel Architecture - basically x86) cores, Intel claims that Larrabee will be scalable to TeraFlops of processing power, while Intel’s new vector instruction set is capable of both floating point and integer calculations. As well as this, Intel says the chip will feature a new cache architecture, although no details of this have been specified yet.

What’s interesting is that Intel sees Larrabee as a strong alternative to today’s traditional GPUs. In a presentation slide shown at a US press conference yesterday, Intel listed ‘triangle/rasterisation’ and ‘rigid pipeline architecture’ as problems with today’s GPUs. However, it listed ‘life-like Rendering e.g. Global Illumination’ as a benefit of Larrabee.

Considering that Global Illumination is a part of DirectX 10.1, and is supported by ATI’s latest Radeon HD 3000-series GPUs, which also feature multiple stream processors rather than traditional pixel pipelines, you could think of this as pretty rich. However, the fact that the cores are based on Intel’s x86 architecture with a new vector processing unit, rather than being simple scalar stream processors, could mean that the chip is capable of some impressive calculations.

One example is physics, and Intel claims that current mainstream graphics cards are ‘inefficient for non-graphics computing’ such as this. Intel sees the programmable and ubiquitous nature of the x86 cores as a big benefit of Larrabee over traditional GPUs, although the company also says that Larrabee will function with DirectX and OpenGL, so it will still need to be able to perform traditional rasterisation in games.
[Images: Intel Larrabee presentation slides]
http://www.custompc.co.uk/news/602265/intel_reveals_details_of_larrabee_graphics_chip.html#
After Intel announced Larrabee, ATI and Nvidia, who had always denied this was the way forward, jumped on the bandwagon; with the advent of multi-core CPUs and multi-GPU graphics cards, the possibility is now real. Nvidia promptly purchased ray tracing specialists RayScale to get up to speed. http://rayscale.com/ http://www.custompc.co.uk/news/602628/nvidia_buys_ray_tracing_specialist.html

Now Intel has managed to ray trace the game ET: Quake Wars, running at 15fps at 720p resolution. This was on a machine with a 16-core processor running at 2.93GHz! http://www.custompc.co.uk/news/602775/intel-ray-traces-enemy-territory-quake-wars.html
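Some back-of-the-envelope arithmetic (my own, not from the article) shows why even 15fps is a feat: counting only one primary ray per pixel, and ignoring the extra rays needed for shadows and reflections, 720p at 15fps means nearly 14 million ray-scene intersection queries every second.

```python
# Primary rays per second for the ET: Quake Wars demo (one ray per pixel).
width, height, fps = 1280, 720, 15
primary_rays_per_sec = width * height * fps
print(primary_rays_per_sec)   # 13,824,000 - shadow and reflection rays come on top
```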

This processing power is beyond the reach of even top-end dual and quad cores, but bear in mind that Intel and AMD will both keep doubling core counts; with Intel only just releasing its 8-core Skulltrail platform, we are halfway there.

The future's bright, the future's ray tracing. :)

(And all our GFX cards will be old hat and we'll need to buy new ones. :( )
http://www.krh.org.uk

[ABA]UNDIES
Post subject: Re: Ray Tracing, What's Around the Corner?   Sat 12 Jul 2008, 05:48

A bit more on Larrabee for you; rumours suggest a 32-core discrete card based on an old CPU architecture.

Quote :
Since we wrote this story yesterday, Intel has been in touch to point out that Heise was speculating about the type and number of cores used in Larrabee, and that neither Justin Rattner nor Pat Gelsinger announced any details about the type or number of cores used in Larrabee.

German tech site Heise has speculated that Larrabee’s multiple IA cores will in fact be based on Intel’s P54C architecture, which was last seen in the original Pentium chips, such as the Pentium 75, in the early 1990s.

Of course, the cores will be a bit more sophisticated than that, and much smaller, as they will be fabricated on a 45nm process. Heise also reckons that the cores will feature a 512-bit wide SIMD (single instruction, multiple data) vector processing unit. The site calculates that 32 such cores at 2GHz could make for a massive total of 2TFLOPS of processing power.

Expreview also has a diagram of Larrabee which shows the layout of the PCB. The card features one 150W power connector, as well as a 75W connector. Heise deduces that this could result in a total power consumption of 300W once the 75W from the PCI-E slot is included, although it’s extremely unlikely that the card will max out every single power source. Even so, it’s going to eat a lot of power.

http://www.custompc.co.uk/news/602910/updated-rumour-control-larrabee-based-on-32-original-pentium-cores.html
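Heise's 2TFLOPS figure checks out with simple arithmetic (my own sketch; the assumption that each SIMD lane can retire one multiply-add, i.e. two FLOPs, per cycle is mine, not Heise's): a 512-bit vector unit holds sixteen 32-bit floats, so the speculated 32 cores at 2GHz multiply out as follows.

```python
# Reproducing Heise's speculated Larrabee throughput: 32 cores, 2GHz, 512-bit SIMD.
cores = 32
clock_hz = 2 * 10**9                 # 2GHz
simd_lanes = 512 // 32               # 16 single-precision floats per 512-bit vector
flops_per_lane_per_cycle = 2         # assumption: one fused multiply-add per cycle
total_flops = cores * clock_hz * simd_lanes * flops_per_lane_per_cycle
print(total_flops / 10**12)          # 2.048, i.e. the '2TFLOPS' headline figure
```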

Quote :
After giving P54C to the American military, Intel apparently now has the core back, and it’s been thoroughly debugged with a much smaller footprint.



As you may have noticed, we recently updated a Rumour Control story about Larrabee being based on several P54C (original Pentium) cores after a response from Intel. Interestingly, however, Intel didn’t deny the rumour, but simply pointed out that it was German tech site Heise who suggested that Larrabee would be based on P54C, rather than Pat Gelsinger. Interestingly, though, another tech site, Ars Technica, claims to know that Larrabee will indeed be based on P54C.

The site claims that it’s been ‘sitting on’ the same information since last year, and reiterates that ‘Larrabee is, in essence, a bunch of P54C (i.e. pre-MMX) Pentium cores that have been enhanced with very wide vector floating-point resources.’ What’s more interesting, though, is the reason why.

According to Ars Technica, when the old P54C core was past its prime, Intel gave the core’s RTL code to the Pentagon so that the American military could develop a ‘radiation hardened’ version for military applications. Apparently, the Pentagon, which has its own facilities for chip fabrication, then worked on cleaning up the core’s RTL code, before offering it back to Intel when it was no longer useful. After the work from the Pentagon, Ars Technica claims that the P54C now ‘has a very small footprint’ and has also been ‘pretty thoroughly debugged’.

Intel has now ‘modified it for use in the many-core chip that later became Larrabee,’ says the site. Of course, this is all still very much a rumour, but Heise’s speculation could well be on the ball. Larrabee is due to be shown at Siggraph later this year, and Intel has already revealed a few details here. According to the Siggraph write-up, ‘Larrabee uses multiple in-order x86 CPU cores that are augmented by a wide vector processor unit,’ which could well mean multiple P54C (also an in-order x86 architecture) cores with a 512-bit wide SIMD (single instruction, multiple data) vector processing unit, as suggested in our original story.

The Siggraph write-up continues, saying that Larrabee ‘provides dramatically higher performance per watt and per unit of area than out-of-order CPUs on highly parallel workloads and greatly increases the flexibility and programmability of the architecture as compared to standard GPUs.’

http://www.custompc.co.uk/news/604419/rumour-control-why-larrabee-will-be-based-on-p54c.html
[ABA]UNDIES
Post subject: Re: Ray Tracing, What's Around the Corner?   Fri 08 Aug 2008, 06:23

More bits of Larrabee info seem to creep out as time goes on.

[Image: Larrabee multi-core diagram]

Quote :
An alternative to developing faster—but hotter—processors, Larrabee will have between 16 and 48 processor cores aboard, all compatible with the classic x86 instruction set.

This massively-parallel architecture is ideally suited to gaming systems, of course, but Intel plans on extending its usefulness into the handheld and even supercomputing domains. Larrabee's chief designer puts the new chip architecture "on the level of the 432 or the Itanium.” It'll be competing against next-gen chips from Nvidia and ATI, which will have between 256 and 800 cores, so Larrabee is relying on its "high speed ring" which interconnects cores more efficiently than current designs. Should be available in late 2009 or early 2010. Interesting stuff.

http://gizmodo.com/5032612/intels-larrabee-multi+core-gpu-chips-get-detail-timescale

Quote :
The company said it would initially aim Larrabee at the personal-computer graphics market, where its “many-core” design, with more than a dozen and eventually hundreds of processing units on a single silicon chip, would be especially useful.

But Anwar Ghuloum, an Intel parallel computing engineer, said that over the next half-decade Intel planned to make the chip design available to an increasingly broad spectrum of the computing world, from Windows and Macintosh desktop personal computers to handhelds and even supercomputers.

The market for add-on graphics accelerators, which are prized by PC game players, is now dominated by Nvidia and the ATI division of AMD. Intel’s approach will be distinguished by its reliance on the industry standard x86 instruction set, which will allow the chips to take advantage of a huge library of existing software.

In 2004, after finding that it could not make its chips faster because they were overheating, Intel adopted a strategy it referred to as a “right-hand turn.” It switched to improving performance by increasing the number of processing elements, or cores, on each chip. That led first to dual-core and now quad-core chips.

Analysts said the first generation of Larrabee may have 16 to 48 cores, depending on the performance goal.

Intel has tried several approaches to chip design, but none of them have had the impact of its x86 family, which was originally introduced three decades ago. Architectures that have been less successful include the Itanium and the 432, neither of which was adopted in mainstream computing.

Now the company is hoping Larrabee will succeed where those other designs have failed.

“This is on the level of the 432 or the Itanium,” said Doug Carmean, the chief architect for Larrabee. “We’ve had to learn from those experiences.”

In the near term, Intel is hoping that it will be able to open a wedge in the market for graphics chips, which was a roughly $7 billion market in 2007 and is growing modestly.

In contrast to ATI and Nvidia, Intel would seem to be at a significant disadvantage in the number of cores it is offering. Next-generation ATI and Nvidia chips will have 800 and 256 cores respectively, said Jon Peddie, president of Jon Peddie Research in Tiburon, Calif. He said Intel, however, was focusing on an approach known as ray tracing, which can be used to add realism to games and animation products.

Ray tracing requires efficient processor-to-processor communication, he said, and the Larrabee design has a high-speed ring that connects its processors more effectively than current graphics accelerators.

He estimates that if the company’s strategy succeeds, it could capture as much as a third of the graphics add-on market in 2010, which might be worth $4.6 billion.


http://www.nytimes.com/2008/08/04/technology/04intel.html?_r=1&ref=technology&oref=slogin

[Image: AnandTech Larrabee article graphic]

Quote :
It started with Intel quietly (but not too quietly) informing many in the industry of its plans to enter the graphics market with something called Larrabee.

NVIDIA responded by quietly (but not too quietly) criticizing the nonexistent Larrabee.

What we've seen for the past several months has been little more than jabs thrown back and forth, admittedly with NVIDIA being a little more public with its swings. Today is a big day: without discussing competing architectures, Intel is publicly unveiling, for the first time, the basis of its Larrabee GPU architecture.


Well, it is important to keep in mind that this is first and foremost NOT a GPU. It's a CPU. A many-core CPU that is optimized for data-parallel processing. What's the difference? Well, there is very little fixed-function hardware, and the hardware is targeted to run general purpose code as easily as possible. The bottom line is that Intel can make this very wide many-core CPU look like a GPU by implementing software libraries to handle DirectX and OpenGL.

It's not so much emulating a GPU as directly implementing, on a data-parallel CPU, functionality that would normally be done on dedicated hardware. And developers will not be limited to just DirectX and OpenGL: this hardware can take pure software renderers and run them as if the hardware was designed specifically for that code.

There is quite a bit here, so let's just jump right in.

http://www.anandtech.com/cpuchipsets/intel/showdoc.aspx?i=3367
[ABA]UNDIES
Post subject: Re: Ray Tracing, What's Around the Corner?   Fri 15 Aug 2008, 21:49

[Images: Intel ray tracing screenshots]
Quote :
Intel’s Daniel Pohl hints at Larrabee’s ray tracing capabilities, and says he believes that rasterisation will be replaced in the next few years



Intel may have revealed the theory behind its forthcoming Larrabee graphics architecture, but so far it hasn’t revealed much beyond its DirectX and OpenGL capabilities. However, there’s little doubt that if it catches on, a freely programmable x86 graphics architecture could have massive implications for the future of games. In particular, Intel now says that rasterisation will start dying out in the next few years.

Speaking to Custom PC, Intel’s ray tracing guru Daniel Pohl said: ‘Looking ahead five to ten years from now, I believe that rasterisation will be used less and less in games, and will be replaced by other algorithms and/or combinations of them.’

Pohl also hinted at some of the ray tracing capabilities of Larrabee. ‘Besides being a rasteriser with DirectX and OpenGL support,’ explained Pohl, ‘Larrabee will also be a freely programmable x86-architecture, so you’ll be able to do real-time ray tracing.’ However, the more interesting question, according to Pohl, is ‘with what scene complexity, in what resolution, with what kind of light settings and with how many dynamic objects it will be possible to do that.’

Real time ray tracing is the talk of the town at the moment, and not only with regards to Intel. At Siggraph this week, Nvidia also demonstrated real-time ray tracing on GPUs. Of course, shifting the games industry away from rasterisation will be a big job, and one that is likely to take many years, but Intel clearly sees this as the way of the future. Is Intel on the right track here? Is ray tracing going to play a big part in games in the future? As always, let us know your thoughts in the comments section below.

Look out for a full interview with Daniel Pohl about ray tracing and the future of games next week.

http://www.custompc.co.uk/news/604653/intel-rasterisation-will-be-replaced-in-five-years.html#

Quote :
Quadro Plex system with four GPUs runs automotive ray traced demo at 30fps at 1,920 x 1,080



If rasterisation does indeed die out in favour of ray tracing, as Intel believes it could, then where does this leave GPUs as we know them? Well, if Nvidia’s latest demo at Siggraph is anything to go by, traditional GPUs could still be more than capable of real-time ray tracing using CUDA.

Nvidia claims that it’s just demonstrated ‘the world's first fully interactive GPU-based ray tracer’ at the show, which was achieved using a Quadro Plex 2100 D4 Visual Computing System (VCS) containing four Quadro GPUs, each with 1GB of memory.

Nvidia says that ‘the ray tracer shows linear scaling rendering of a highly complex, two-million polygon, anti-aliased automotive styling application.’ Impressively, Nvidia claims that the polished car demo runs at 30fps at 1,920 x 1,080, and includes ‘an image-based lighting paint shader, ray traced shadows, and reflections and refractions’ at three bounces.

Nvidia also demonstrated the demo running at 2,560 x 1,600, although frame rates have not been revealed at this resolution. Even so, the demo makes for a visual treat at this resolution, which demonstrates the gorgeous reflective potential of ray traced games. You can download full 2,560 x 1,600 screenshots of the demo below to see for yourself:


Nvidia GPU ray tracing 2,560 x 1,600 screenshot 1
Nvidia GPU ray tracing 2,560 x 1,600 screenshot 2
Of course, this is still very much a tech demo at the moment, and it will probably be a long while before we ever see mainstream games using this technology. Nvidia will release its new D series of Quadro Plex systems next month, starting from a price of $10,750 US (£5,788).

http://www.custompc.co.uk/news/604656/nvidia-demos-real-time-gpu-ray-tracing-at-1920-x-1080.html