SW or HW?
category: general [glöplog]
What's with the thread reviving?
actually, perhaps this thread is relevant again? Or will be in a year or so.. We now have software on hardware, i.e. cuda, intel's larrabee stuff, opencl.. so will it be traditional opengl/directx, or writing your own renderer to run on the gpu next year?
Fasttracker2!@_#)@_+
@okkie: It's art.
Hardware is softcore, software is hardcore.
Reviving an old war. One that I find interesting. Why?
SW till the day I die!! HW is for busters!
Quote:
Hardware is softcore, software is hardcore.
thumbup
I disagree.
anyone who has a strong opinion about either strategy obviously renders him- or herself irrelevant, in that the big picture can't be seen?
come on, it's 2008, options are more complex than this stupid argument you're all having here.
niels, it's a 6 year old thread, to which you contributed a lot ;)
There's no "software" unless you print your shit on a piece of paper and interpret it bu yourself with a pencil or something. Yes, you pretty much get the idea how stupid and retarded this sounds.
Now harnessing new hardware is not as easy as some of you might think. Unless you take an SDK example source, add some overlays and release it as a demo. In which case you get thumbs down and a free dance with optimus (who actually codes, or coded, software renderers, to be fair!).
As I see it, Larrabee will be a bag of x86 cores, so you'll have to know how to write efficient, highly parallel code. While current hardware (gfx cards) does some dirty work for you, as do drivers and glu/glut/d**dx/whatever, the reality is the demoscene has shifted from purely technical effects to style and direction (which it actually never quite lacked in the late 90s). So mr distorted mind (oh hello stefan) could never actually deliver on par.
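To make "highly parallel" a bit more concrete, here's a rough sketch in plain C of how a software renderer might split scanlines across cores. Nothing Larrabee-specific: the thread count, resolution and the shade() function are just placeholders for illustration.

Code:
/* Minimal sketch: splitting a software-rendered frame across CPU cores.
   Thread count, resolution and shade() are placeholders, not a real engine. */
#include <pthread.h>
#include <stdint.h>

#define W 640
#define H 480
#define NUM_THREADS 8

static uint32_t framebuffer[W * H];

static uint32_t shade(int x, int y)      /* dummy per-pixel work */
{
    return (uint32_t)((x ^ y) & 0xFF) * 0x010101u;
}

static void *render_slice(void *arg)
{
    int id = (int)(intptr_t)arg;
    /* each worker renders every NUM_THREADS-th scanline */
    for (int y = id; y < H; y += NUM_THREADS)
        for (int x = 0; x < W; ++x)
            framebuffer[y * W + x] = shade(x, y);
    return 0;
}

void render_frame(void)
{
    pthread_t workers[NUM_THREADS];
    for (int i = 0; i < NUM_THREADS; ++i)
        pthread_create(&workers[i], 0, render_slice, (void *)(intptr_t)i);
    for (int i = 0; i < NUM_THREADS; ++i)
        pthread_join(workers[i], 0);
    /* framebuffer is now ready to be blitted to the screen / GPU */
}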
On an unrelated note, if you still want to be honest with yourself and code all the additional libs and probably the s/w rendering engine all by yourself, there's nothing to stop you. And you will most likely still get thumbs up. From me at least.
Depends on how you define it, now more than ever.
To me, putting it very simply and in extreme black + white, I'd say the hardware route is taking dx/opengl, firing your models onto the video card, and letting the gpu render + light it. Software would be writing your own rasteriser, whether it's a traditional cpu based one that sends finished frames to the video card, or a high end raytracing engine that runs on the gpu. In the middle you have custom shaders...
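For the record, the "own rasteriser" route boils down to something like this toy C sketch: fill triangles into a CPU-side framebuffer and hand the finished frame to the video card as a plain bitmap. The resolution and colour format here are just made up for illustration.

Code:
/* Toy software rasteriser: fill one 2D triangle into a CPU-side framebuffer
   using edge functions. The finished frame would then be blitted to the
   video card as an ordinary bitmap. */
#include <stdint.h>

#define W 320
#define H 200

static uint32_t fb[W * H];

static int edge(int ax, int ay, int bx, int by, int px, int py)
{
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax);
}

void fill_triangle(int x0, int y0, int x1, int y1, int x2, int y2, uint32_t colour)
{
    for (int y = 0; y < H; ++y)
        for (int x = 0; x < W; ++x) {
            /* point is inside if it lies on the same side of all three edges */
            int e0 = edge(x0, y0, x1, y1, x, y);
            int e1 = edge(x1, y1, x2, y2, x, y);
            int e2 = edge(x2, y2, x0, y0, x, y);
            if ((e0 >= 0 && e1 >= 0 && e2 >= 0) ||
                (e0 <= 0 && e1 <= 0 && e2 <= 0))
                fb[y * W + x] = colour;
        }
}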
We're already seeing a lot of raytracing stuff done in 4ks on the gpu. Larrabee/cuda/opencl will make much more complex stuff possible; i heard intel will be providing a realtime raytracing engine with Larrabee. Coding your own rendering engine for hardware like that is pretty hardcore (probably more so than doing a traditional rasteriser), and the results should be pretty interesting!
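And the core of that gpu raytracing stuff is just per-pixel maths like the sketch below. It's written here as plain C, but on cuda/opencl/larrabee the same function would simply run as a kernel, one ray per thread. The camera and sphere are placeholders, not anything from a real intro.

Code:
/* Per-pixel ray vs. sphere test -- the innermost loop of a toy raytracer. */
#include <math.h>

/* returns 1 and the hit distance *t if a ray from 'origin' along 'dir'
   (normalised) hits a sphere at 'centre' with radius r */
static int ray_sphere(const float origin[3], const float dir[3],
                      const float centre[3], float r, float *t)
{
    float oc[3] = { origin[0] - centre[0],
                    origin[1] - centre[1],
                    origin[2] - centre[2] };
    float b = oc[0]*dir[0] + oc[1]*dir[1] + oc[2]*dir[2];
    float c = oc[0]*oc[0] + oc[1]*oc[1] + oc[2]*oc[2] - r*r;
    float d = b*b - c;               /* discriminant (a == 1 for unit dir) */
    if (d < 0.0f)
        return 0;
    *t = -b - sqrtf(d);              /* nearest intersection along the ray */
    return *t > 0.0f;
}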
HW = Hallo Was?
i guess lowlevel hw coding is kinda like fine Renaissance portrait painting. some people still paint hyper-real images, even though we now have photography. not out of necessity, but because they can and have a desire to touch the core of graphics programming and understand it on a different level. hats off to those guys still preserving these techniques. from what i am reading recently in the gpu press, many of these so called "obsolete" techniques may be required again... soooo, who's to say.
anyway, hats off to the new schoolers who only do what is required to help them push forward into new boundaries. everyone has his part to play. there is no wrong or right. so it's a silly conflict of subjective ideals....as usual.
On the CPC we used to call "software effects" the effects not produced by the CRTC: the mid/newschool chunky effects, dots, 3d and stuff the CPC hadn't seen much of, because everyone was insisting on doing bouncing rasters, hardware splits and scrollers in most CPC demos. Which could be a bit confusing, because hardware effects are also written with code, which is also software; and a good never-seen-before-on-the-CPC software-style effect could well be a combination of software rendering and hardware tricks.
Whatever, I'd like to see more software rasterizers and especially 2d effects on the PC.
Quote:
some people still paint hyper-real images, even though we now have photography. not out of necessity, but because they can and have a desire to touch the core of graphics programming
rembrandt for best coder!
I liked Nsync when they were still underground!
I always thought New Kids on the Block were more edgy.
sxrebbel: and Chaos is one of the greatest portrait painters of the 17th century!
I'm going to see New Kids next year march! No lie! ^^
"if that last scene from 2ndReal can run on a C64@1mhz, I don't see why an 800 MHz unaccelerated PC can't do something 800 times as good. =P"
that is an animation on the c64 :) tho doing it realtime won't be much slower i guess.
Optimus,
1, 6510 == 6502 + IO port. Data Direction Register mapped to $00 and Data Register to $01. (iirc... on the c64 it's used to turn ROMs on/off, and has nothing to do with IO :) (see the sketch below)
2, The drive has a 6502 inside (yes), and 2kb ram (iirc). You can upload code from the c64 into the drive ram and instruct the drive to run it. Every custom (fast)loader works that way; that's why you need "true drive emulation" on. You can also upload code which calculates a few vertices and passes the result to the c64, or whatever.
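To illustrate point 1 in code: a tiny sketch, assuming the cc65 C compiler targeting the c64, of poking the 6510 port at $01 to bank the BASIC ROM out and back in. The bit meaning (LORAM) is the standard c64 one; the function itself is made up for illustration.

Code:
/* Sketch assuming cc65 targeting the C64.
   $00 is the 6510 data direction register, $01 the data register;
   on the C64 the low bits of $01 select which ROMs are banked in. */
#define CPU_DDR   (*(volatile unsigned char *)0x0000)
#define CPU_PORT  (*(volatile unsigned char *)0x0001)

void bank_out_basic(void)
{
    unsigned char old = CPU_PORT;
    CPU_PORT = old & 0xFE;   /* clear LORAM bit: BASIC ROM out, RAM at $A000 */
    /* ... use the RAM under BASIC ... */
    CPU_PORT = old;          /* restore the previous banking */
}

Point 2 is the usual fastloader trick: push a small routine into the drive's 2kb of ram over the serial bus and tell the drive's own 6502 to execute it.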
Even if it's years since my old post, this can be still interesting :)
Quote:
niels, it's a 6 year old thread, to which you contributed a lot ;)
haha yeah thanks for pointing that out :)
.