Sunday, June 8, 2008

Flash and Silverlight kill CPUs!

Recently, I have been using Linux - Ubuntu 8.04 LTS (story on that later) - and I have kept some applets running that sense the CPU temperature, something I never bothered with on XP or Vista, but Linux brings out the geek in you!
I observed that while the normal operating temperature for both cores is around 108 F (about 42 C), whenever I start a Flash app (let's say I go to YouTube), within 2-3 minutes of watching the temperature reaches around 131 F (55 C), and over an hour it reaches around 151 F (66 C)! It's pretty much the same with Silverlight.
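If you want to poll the temperature yourself without an applet, something like the following works. This is only a minimal sketch in Python, and it assumes a kernel that exposes sensors under /sys/class/thermal - the paths and zone names vary from machine to machine, so treat them as placeholders.

    # Minimal sketch: poll CPU temperatures from sysfs.
    # Assumes /sys/class/thermal/thermal_zone*/temp exists (varies by machine).
    import glob
    import time

    def read_temps():
        temps = {}
        for zone in sorted(glob.glob("/sys/class/thermal/thermal_zone*")):
            try:
                with open(zone + "/temp") as f:
                    millic = int(f.read().strip())    # kernel reports millidegrees Celsius
                temps[zone.split("/")[-1]] = millic / 1000.0
            except (IOError, ValueError):
                pass                                  # some zones do not report a temperature
        return temps

    while True:
        for name, celsius in sorted(read_temps().items()):
            print("%s: %.1f C (%.1f F)" % (name, celsius, celsius * 9 / 5 + 32))
        time.sleep(5)                                 # poll every few seconds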

That led me to do some more research, and on Vista too I observed the same pattern. So I concluded that it is not the OS that is at fault, but the applications and their very way of rendering. Typically, if you write a program with an infinite while loop, you keep the CPU busy, and a busy CPU (at around 60-80% average usage or more) heats up.
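To see the effect in isolation, compare a loop that spins with one that yields the CPU. This is just an illustrative sketch of the general idea, not anything Flash-specific:

    # Illustrative only: spinning pegs one core at ~100%, sleeping keeps usage near zero.
    import time

    def busy_loop():
        while True:
            pass                 # never yields - one core stays pinned and heats up

    def polite_loop():
        while True:
            # ... do a small slice of work here ...
            time.sleep(0.01)     # yield to the scheduler - CPU usage stays low

Run busy_loop() and watch top: one core sits near 100%, which is the same pattern described above.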
A Flash application renders pixels in software, and I guess I had my hardware acceleration off (I would have to test with it on), though I suspect I would have seen much the same effect. But the way the pixels are rendered seems to keep the CPU very busy! There seems to be something suboptimal in the way Flash and Silverlight have lit up the web - I think in the future there will be better ways of doing this. It isn't acceptable for an application to take 100% CPU for a long time; in my opinion, any client or server application with sustained CPU utilization above 60% is very, very bad.
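My rough mental model of why this happens: if every frame is redrawn pixel by pixel on the CPU with nothing to throttle it, the render loop behaves just like the infinite while loop above. The following is a purely hypothetical toy loop to illustrate that point - render_pixel(), WIDTH and HEIGHT are made up, and this is not how Flash or Silverlight actually work internally.

    # Toy model of an unthrottled software render loop (hypothetical, for illustration).
    WIDTH, HEIGHT = 640, 480

    def render_pixel(x, y, t):
        return (x * y + t) % 256         # stand-in for real per-pixel work

    def render_loop():
        t = 0
        while True:                      # no vsync, no sleep: redraw as fast as possible
            for y in range(HEIGHT):
                for x in range(WIDTH):
                    render_pixel(x, y, t)
            t += 1                       # one core stays saturated the whole time

Capping the loop at a fixed frame rate (sleeping until the next frame is due) or pushing the per-pixel work to the GPU would let the CPU idle between frames.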
