View Full Version : opengl, in depth
I started watching these webcasts. They are the most thorough information I have come across on the workings of OpenGL and why things are done the way they are.
http://www.ece.ucdavis.edu/courses/W08/EEC277/
Scroll down to the syllabus section of the page to see the webcasts.
The links aren't working tonight; I did write the professor about it. You can use the webcast titles to find the videos on UC Davis's YouTube page:
http://www.youtube.com/user/UCDavis#p/u
Here is the course intro. There is a lot of noise in the first 30 seconds, so just mute your speakers until the 30-second mark. This will give you a good idea of everything that is in the course and how in-depth it is:
http://www.youtube.com/user/UCDavis#p/u/53/R2jMkg1TpEM
Update: The professor, John Owens, did write back tonight... apparently the website links will no longer work. He said the videos are also available on iTunes. I am finding them on YouTube, as I wrote above.
Petr Schreiber
27-04-2010, 08:56
Thanks Kent,
the first 30 seconds of noise sound like the slowed-down speech of the aliens from District 9.
Let us know if you find the videos online.
Petr
They are on YouTube, Petr. Using the syllabus lecture names, you can search for ucdavis plus a few words from the lecture title and it pops up.
One time it brought them up as a playlist, but I have not been able to get it to do it again.
They cover some really low-level stuff that is fascinating. This is a graduate course for engineers, I think, as they seem to be talking to the people designing processors and graphics cards. But there are cool things even for novices like me: how and where graphics cards are going and why. It makes a lot of sense, and the level of thought that goes into all of this is fascinating.
For instance: the benefits of the software rasterizing that Intel is planning compared to the hardware rasterizers that NVIDIA is going with, or why the fixed pipeline is becoming more programmable, with stages in the order the programmer wants. If this sort of thing interests you, then this course is really great!
Petr Schreiber
28-04-2010, 19:17
Oh,
silly me, will do :)
Regarding Intel's promises about their revolutionary raytracing machine... I have been hearing that story for years. I will believe it once I can try it myself. They had better hurry, as more and more GPU raytracing solutions are springing up already...
A programmable pipeline surely is the best way to the best graphics! On the other hand, it seems we are going back in time, when the programmer had to code everything himself and have good knowledge of various visual phenomena, math...
Petr
From what I can tell, Petr, the fixed pipeline will still be available, but if you want to do other things in custom ways, that will be more and more possible. That sounds good to me.
I don't know if you saw the lecture about why we have been stuck in the 3 GHz CPU range for so many years and are going multi-core, but now that I know why, I don't get as angry thinking about it. The talk about bandwidth and latency was also enlightening. I also liked when he talked about how the speed of light is now a limiting factor in designs. It is really amazing, all the thought that goes into this stuff. Who would have thought we would already be running into the speed of light as a limit?
I was also surprised to see the Nintendo GameCube had a graphics chip named Flipper, made by ATI. It had framebuffers built into the GPU; that was wild to see.
I had never heard of the beam-chasing technique; that was really interesting too, and a very clever idea they came up with to solve the problem back then. Many of these things I am running into for the first time, so this course is really exciting. The list of guest speakers is amazing too; too bad those talks are not available. The NVIDIA speaker, I gather, agreed to let them tape his talk, but I have yet to find it.
Charles Pegge
29-04-2010, 08:25
Electric currents and light travel at a maximum of around 3e8 metres per second. So at 3 gigahertz, the maximum distance an electric signal can travel in one clock cycle is 0.1 metres (about 4 inches). This implies that to remain in sync, signal path lengths have to be within about 1 inch (1/4 wavelength) of each other. This is on the scale of a Pentium die!
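The arithmetic above can be checked with a quick script (a minimal sketch; the "1 inch" figure comes from taking a quarter of the per-cycle travel distance):

```python
# Sketch of the arithmetic above: how far a signal travels in one clock cycle.
c = 3e8   # approximate speed of light, metres per second
f = 3e9   # clock frequency, 3 GHz

wavelength = c / f        # distance travelled in one cycle: 0.1 m, about 4 inches
quarter = wavelength / 4  # sync tolerance: 0.025 m, about 1 inch

print(wavelength)  # 0.1
print(quarter)     # 0.025
```

At higher clock rates the budget shrinks proportionally, which is one reason signal paths on a die have to be kept so short.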
Charles
It is amazing to think about... the fact that we are developing things that are pushing against limits set by the speed of light. It boggles the mind!