The best female computer geek ever!!!
category: general [glöplog]
Maybe I need a new gfx card for all the fun pixel shaders ;)
eye: what kind of 70€ card did you buy?
stelthz: since i don't listen to ATI propaganda, it was a Gainward card based on NVidia's FX5200. i am proud of having been brainwashed early and thoroughly! i have to overclock it to almost normal speed (it has 6ns memory instead of the usual 5ns, i.e. rated for about 166 MHz instead of 200 MHz), but hey, when i burn it out, i get a better one for 70 EUR as well!!! It supports 2 monitors and both DACs are good, the image doesn't have waves, blurs, etc, so i'm 99% satisfied. and it's nonetheless faster than the best card i could get 2 years ago. ;)
I heard the FX5200 is cripplingly slow :) How about the ATI 9600SE?
"Oh, and I don't think you or most other BP visitors know the "shit" she was talking about better than her. That woman is darn competent. At least she writes articles for graphics and game programming books when U don't. :)"
I've no idea what she knows or doesn't know, but I *do* know that 99% of all graphics programming books out there are complete CRAP.. Is she among the 1% of authors? If so, how come she acted like a complete saleswoman who doesn't have a clue, making shitloads of faulty statements? (I don't remember exactly WHAT she said; I remember we felt sad for ATI for spoiling such a good opportunity as they had, and then we left the hall to have some fun instead)..
Oh, and showing off tech demos that are over a year old is not hip. Especially as they were bad already back then ;)
It is too slow for gamers, but i'm too blind to use higher resolutions anyway! Besides, ATI doesn't scale its performance down as gracefully as NV cards. That is, you don't have those "ugly, slow, and ultra-slow" modes on it! Yikes! :) No, demos from Kolor and others were the deciding argument, because they don't run on ATI. But hey, big respect to ATI because They Have Accelerators That Can Beat Our Accelerators! If you're willing to spend a bit more, you can get an FX5600, which has a noisy fan but is faster and has fewer visual bugs.
BTW, i think the thermal design is flawed. Something like this would be better: imagine a thick (0.5-1 cm) solid block of copper laid on the chip. The card is shallow but long, so the block would stick out well past the card's edge, with vertical holes drilled into it. Unlike current systems, this would cause significant convection and would probably make a fan unnecessary for FX5600-class cards. It definitely cools worse than a fan, but it could definitely be made to cool better than Zalman's heatpipe, which is more than enough for an FX5600. i have a nearly noiseless PC, and i would have nothing against a better gfx card in it.
At least, if it wasn't of great interest to coders, it was entertaining in its own weird fucked-up way :)
"weren't we doing that by software in 97?", funniest quote ever!!! (at scene.org poll =)
Quote:
arneweisse: actually, when we talked beforehand about what the show should be like, I made the "mistake" of telling her that a big part of the demo scene is quite reluctant to use shaders. Picturing it as if the whole demoscene were stuck in the DX7 era was her own idea. So Stelthz is quite right there ;)
Oh, and I don't think you or most other BP visitors know the "shit" she was talking about better than her. That woman is darn competent. At least she writes articles for graphics and game programming books when you don't. :)
Word up KB!
eye: uhm well.. that is totally off topic. But why do you think a thick copper block is better? It just adds thermal mass, but the free surface is not increased. Besides that, I think the card designers have engineers able to simulate and handle simple steady state convection problems :)
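(To put a rough number behind that: steady-state convective transfer is approximately Q = h * A * dT, i.e. the heat carried away scales with the exposed surface area A and the temperature difference dT, not with the mass of the metal. A thick block only helps insofar as its bulk ends up exposing more fin area to the air; otherwise it just takes longer to heat-soak. Rule of thumb only, ignoring geometry and airflow details.)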
arne: Are you 15? Maybe it is time to get a bit more objective? There is knowledge outside of the demoscene.. and not all books that don't have you as their target audience are crap.
I really hate this black&white judging..
Though I wouldn't exactly put it that way, I do agree with what Stelthz is trying to say here.
Actually, you'd be surprised how much stuff that's so darn obvious 'for us' is new to others.. I recall yawning my freaking ass off at some ATI talk about postprocessing effects last year. And so did a few other ex-sceners around me.. The rest of the people were busy paying attention and writing stuff down :)
stelthz:
You think I believe the demoscene is ahead of ATI's developers? Seriously? The demoscene is *behind* game development in general, and especially behind the big graphics card companies, no question about that.
What we are discussing is ONE performance by ONE person from ATI (?), and a bad one. I know nothing about what she does or has done, but the point about the realtime graphics literature stands..
Statements like "small and very flexible" to claim it fits into an intro make me laugh.. Is this why we have to generate zillions of shaders, because there are no loops in pre-3.0 shaders? Is that flexible? One shader for every light combo.. Right.. Just check how many shaders they generate in, for instance, the leaked HL2 sources and you'll see for yourself how "small and flexible" they are (rough sketch of the combinatorics below)..
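To make that combinatorics point concrete, here's a minimal back-of-the-envelope sketch in C++ (the light caps and feature flags are made-up examples, not numbers from HL2 or any real engine): every option that can't be a runtime loop or branch multiplies the number of shaders you have to generate and compile.

#include <cstdio>

int main() {
    // Pre-SM3.0 shaders have no real loops or dynamic branching, so every
    // feature combination becomes its own compiled shader variant.
    const int pointLightVariants = 5;  // 0..4 point lights (hypothetical cap)
    const int spotLightVariants  = 3;  // 0..2 spot lights (hypothetical cap)
    const int onOffFeatures      = 3;  // e.g. specular, normal map, fog

    const int total = pointLightVariants * spotLightVariants
                    * (1 << onOffFeatures);  // each on/off flag doubles it
    std::printf("%d shader permutations for ONE material\n", total);  // 120
    return 0;
}

Five light counts times three times eight on/off combinations is already 120 variants for a single material, which is the same kind of explosion the build log quoted below shows.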
True that.
Quote:
NxShaderBuilder 1.01
Now processing: c:\<undisclosed>\engine\src\renderers\Xbox\Shaders\Default.nxShader
Compiled shaders summary: 134KB, 260 Unique, 1024 Combinations, 4096B Table
And that's just "default"..
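A plausible reading of those numbers (just a guess at what the tool does, not NxShaderBuilder's actual mechanics; all names below are made up): many flag combinations compile to identical bytecode, so the builder can map all 1024 combinations onto ~260 unique shaders and keep a 1024-entry lookup table (1024 * 4 bytes = the 4096B table). A toy C++ model of that dedup step:

#include <cstdio>
#include <map>
#include <string>
#include <vector>

int main() {
    // Toy "compiler": pretend each combination's bytecode is a string.
    // A real builder would hash the compiled bytecode blob instead.
    std::map<std::string, int> uniqueShaders;  // bytecode -> shader id
    std::vector<int> table;                    // combination -> shader id

    for (int combo = 0; combo < 1024; ++combo) {
        // Combinations differing only in flags the shader ignores collapse
        // to the same "bytecode" (toy model: drop the low 2 bits).
        std::string bytecode = "code_" + std::to_string(combo >> 2);
        auto it = uniqueShaders.find(bytecode);
        int id;
        if (it == uniqueShaders.end()) {
            id = (int)uniqueShaders.size();
            uniqueShaders[bytecode] = id;
        } else {
            id = it->second;
        }
        table.push_back(id);  // one table entry per combination
    }
    std::printf("%zu unique, %zu combinations, %zuB table\n",
                uniqueShaders.size(), table.size(), table.size() * 4);
    return 0;
}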
[quote="superplek"]Though I wouldn't exactly put it that way, I do agree with what Stelthz is trying to say here.
Actually, you'd be surprised how much stuff that's so darn obvious 'for us' is new to others.. I recall yawning my freaking ass off at some ATI talk about postprocessing effects last year. And so did a few other ex-sceners around me.. The rest of the people were busy paying attention and writing stuff down :)[/quote]
Am I misinterpreting what stelthz is saying, or wasn't he saying the opposite?
And we don't have real BBCode here :(
Stefaqn: It's just some flaky stuff about everyone and everything having its place in 'the system' I guess :)
I once read a nice article by Tatarchuk. Not that technically advanced, but nice nonetheless. I wonder why she fucked up at BP (did she?), as she obviously knows more about the subject than, well, most of the others there :)
Eh, that's "Stefan" of course.
stelthz: you seem to be forgetting that the card hangs upside down, so that the hot air cannot escape from the cooler by itself. that's why it needs the fan.
now you can look up what Zalman's heatpipe looks like. it is 2 radiators, one above and one below the card, connected by a pipe with a fluid. i'm somewhat unsure how efficient that fluid is, but the best it can do is make the temperature on the upper and lower radiators nearly equal, so the upper radiator takes most of the cooling job. it should be able to create convection. but imagine how much better it would be with the radiator hanging off the card's side! its thickness would let it transfer heat effectively, and it would also increase the surface on which the radiator contacts the convecting air. these days there are few other cards in PCs, and there are holes along the side or the back, so it would effectively suck air through itself. i might even build one if i get a card, and tell you what it's like. you only have to imagine how simple and ingenious that is. This would work particularly well in my bigtower computer, where i have one fan sucking air in at the front and 4 pushing it out at the rear, all noiseless compared to the ones on gfx cards. ;)
i think there are no thermal designers; they just throw together whatever is at hand and check that the chip doesn't overheat - at least on cheap cards. on others, i have even seen main-chip radiator constructions which make the memory overheat. ;(
Generally I don't think it's a good thing to go up on stage acting arrogant when nobody knows who you are, what you have done, etc. And then to start bashing the people in the hall without having a clue who they are, saying they don't know shit, while spreading bullshit hype words as if they were at Assembly or something. And what did she show? The *worst* way of doing shaders at all: that horrible ATI program, which you seriously don't build shaders from the ground up in, though it's very nice to rewrite them for RenderMonkey (is that what it's called?) and then let the artists fiddle around with them.. But I prefer materials that merge better with the leading 3D programs from the beginning.. That's just my opinion, but still... I haven't heard of ANYONE who codes on the PC who found that performance at all enjoyable, except perhaps for the fact that everyone was bashing her ;-)
Quite frankly, acting "arrogant" on stage and bashing people in the hall... isn't that exactly what a lot of sceners were doing at Breakpoint (at least partly for fun)? That's what she was doing, basically. I can't believe some egos were bruised in the process.. at least some hacked like they were pissed :)
acted, not hacked.
whatever :)
Don't blame my Natasha :(
"Quite frankly, acting "arrogant" on stage and bashing people in the hall... wasn't exactly what a lot of sceners were doing at breakpoint? (At least partly for fun. That's what she was doing, basically. I can't believe some egos were bruised in the process.. at least some hacked like they were pissed :)"
It's quite different when some non-commercial guy stands up and shouts AMIGA ROCKS and when one of the leading GPU developers stands up and talks bullshit.. I would, however, love to see a GOOD seminar/whatever by the ATI folks; it could actually be madly fun, but instead we got this.. I find it sad, especially for ATI, who I don't think managed to spread any real info, they just came across as annoying.. Maybe it was fun if you could see her, but I couldn't, and quite frankly didn't care :)
with all the crap she was getting from the audience, if she hadn't acted arrogant she would have had to walk off the stage :) it's not easy talking to an audience of drunk sceners about what you can do with shaders: you have to explain to the non-coders what shader technology easily lets you do, while also explaining how to excel at it without hurting any leet coder's ego. she failed a bit on the second part, but i think she did pretty well overall, all things considered..
She wasn't getting bashed before she started talking bullshit >)