May 7, 2012, 02:25 #26
May 7, 2012, 13:14 #27
PARTICULARLY on ARM... I'll get into that in a second.
Even simple things like images can consume large amounts of battery; you might have that image nice and compressed to 32k as a JPEG, but it has to decompress to a megabyte or more in memory. On systems that, if you are LUCKY, have around 128 megs free by the time the OS and browser are in place (and typically as little as 64 megs), it's going to start destroying the flash drive by swapping to it. This is why on the iPhone some pages render and scroll smoothly, but load one up with images and CSS3 effects and suddenly when you scroll you get a white screen and the device goes off to never-never land for two or three seconds... and in those two or three seconds consumes the same amount of juice as a minute of sitting there at idle.
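To put a rough number on that decompression cost, here's a back-of-envelope sketch. It assumes the browser decodes to 32-bit RGBA (4 bytes per pixel), which is typical but not guaranteed on every platform:

```python
# Rough sketch: how much RAM a "small" JPEG costs once decoded.
# Assumes a 32-bit RGBA framebuffer (4 bytes/pixel) -- typical,
# but the exact decode format is platform-dependent.

def decoded_size_bytes(width, height, bytes_per_pixel=4):
    """Memory needed for the uncompressed bitmap."""
    return width * height * bytes_per_pixel

# A 1024x768 photo that compresses to ~32 KB as a JPEG...
on_disk_kb = 32
in_memory = decoded_size_bytes(1024, 768)

print(f"JPEG on disk:  {on_disk_kb} KB")
print(f"Decoded in RAM: {in_memory / (1024 * 1024):.1f} MB")  # 3.0 MB
```

A hundred-fold blowup like that is why a handful of "small" images can push a 64-meg-free device into swap.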
In a lot of ways, it's like the "tables take too long to render" myth; when tables were fast enough over dialup on a 486/33 in IE 5/lower or Nyetscape 4/lower under Windows 3.1, there is no legitimate excuse to be using their render time as an argument on processors that are three times as efficient per clock cycle at 50 times or more the clock speed!
In that same way, there's no excuse for these devices, with processor speeds ranging from 500 MHz all the way up to 1.5 GHz, to be having such major issues with pages -- unless there's something different about the pages (there is, it's why we point at developers and sites so readily) or the hardware itself. We cover the former a whole lot, but let's take a serious look at the latter -- the mobile hardware platform itself.
I've been playing with several ARM platforms, what with my new 'project' and all, and have been shocked by how inefficient RISC is. For all the claims of efficiency and speed compared to x86, to be brutally frank it ISN'T EVEN CLOSE... Many things you can do in one operation (and therein one clock) on x86 take multiple opcodes to accomplish on ARM -- and because there are gaping holes in the functionality, you end up having to brute-force code a LOT of things. Brute-force code is always bigger and less efficient.
A great example of this is floating point. I'm actually having to rethink my gumberoo project because most ARM units don't even HAVE true FPUs... I made the crazy assumption that the 'VFP' (vector floating point) unit was at least FULLY IEEE 754 ready; come to find out not only are the floating point functions provided NOT complete in terms of double precision, they don't even EXIST for extended precision, and most ARM processors currently in circulation don't even HAVE the VFP installed!
That means an ARM processor is less capable at floating point math per clock cycle than my Tandy 1000 SX!!! (since that has an 8087). That's horrifying to even think about... I mean, Intel made the x87 standard issue on the 486DX instead of a separate chip -- that was what, 23 years ago? Hell, from the Pentium II days onward it wasn't even faster to do fixed point math on the integer unit anymore, thanks to improvements in the FPU.
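For the curious, here's roughly what "brute forcing" fractional math on an integer-only core looks like -- a minimal Q16.16 fixed-point sketch. Real fixed-point and soft-float routines do considerably more (rounding modes, saturation, overflow handling), so treat this as an illustration of the technique, not production code:

```python
# Minimal Q16.16 fixed-point sketch: fractional math using only
# integer operations, as you'd have to on a core with no FPU.
# Real libraries add rounding, saturation, and overflow checks.

FRAC_BITS = 16
ONE = 1 << FRAC_BITS  # the value 1.0 in Q16.16

def to_fixed(x):
    """Convert a float to Q16.16 (16 integer bits, 16 fraction bits)."""
    return int(round(x * ONE))

def fixed_mul(a, b):
    # The product of two Q16.16 values carries 32 fraction bits;
    # shift back down to 16 to stay in format.
    return (a * b) >> FRAC_BITS

def to_float(a):
    return a / ONE

a = to_fixed(3.25)
b = to_fixed(2.5)
print(to_float(fixed_mul(a, b)))  # 8.125
```

Every multiply costs an extra shift, every conversion costs scaling -- multiply that across real code and you see where "bigger and less efficient" comes from.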
The lack of an FPU means the poor little GPU needs to be many, MANY times faster to make up for it when doing things like OpenGL ES. OpenGL has the type GLfloat -- guaranteed to be at least 32 bits, i.e. single precision -- something ARM struggles to handle even with the VFP.
It gets worse though; back when ARM was first being pimped, they constantly compared it per clock to a K6-233 or a Pentium 150... and from what I've seen testing with my own code and things like nBench, they have made ZERO improvements in per-clock performance platform-wide. This means that per clock, the Pentium 3 (which PPC nutters have been pissing on forever as inefficient) blows ARM out of the water... hey, weren't the first-gen Atoms built on the P3 architecture?
... and that's the real kicker; take a good hard look at the Atom Z670 for embedded systems. 3 watts TDP is within spitting distance of most 1.5 GHz ARMs, and it's a full SoC. The external RAM might up the power draw a bit, but then it's at least CONFIGURABLE for more RAM (as opposed to ARM's 'RAM, what's that?' on-die approach)... and in the long run, devices built with it are competitive on power use because the chip is more efficient per clock and can therein spend more time at idle -- or more specifically, use less code to do the same task.
Which is probably why some folks are looking at porting Android to x86..
In a lot of ways I think it comes down to the old CISC vs. RISC debate, where the simple fact is RISC is meant for people who write compilers, while CISC is meant for people who write programs... Even so it's laughable, since most of today's "CISC" processors have a translation layer and microcode -- internally they're actually RISC... So even with the translation layer and microcode it's STILL faster per clock -- and can be made comparable on power consumption in terms of raw execution.
Makes one go "ok, so why is ARM lagging so far behind?" -- Oh wait, you'd think they were the only name that mattered in mobile when it comes to mindshare... and we can't go after them for a monopoly because ARM doesn't actually make chips; they license their designs to whoever wants to make one, be it TI, Samsung, Fujitsu, Allwinner, etc.
More of the performance issue with them, though, would be SO simple to rectify -- RAM. For all the talk of clock speeds, ARM devices are still using what is basically PC-133, which by today's standards is some piss-poor-performing memory; it's all 'on-chip', so expanding the memory isn't even a possibility (since they don't expose the bus in a practical sense), and the amounts they give us are by modern standards anemic. 512 megs is bupkis, barely leaving you with a third of it free after an iOS or Android install. Admittedly more RAM means more power draw all the time; but swapping to flash because you ran out of RAM not only means the flash will die sooner, it also means far, far higher power consumption when you are actually trying to use the device to do things.
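For a ballpark on how far behind PC-133-class memory is, compare peak theoretical bandwidth -- transfers per second times bus width (real-world throughput is lower, and embedded buses vary, so these are illustrative figures only):

```python
# Peak theoretical bandwidth = transfers/sec * bus width in bytes.
# Ballpark figures only; sustained real-world throughput is lower.

def peak_bandwidth_mb_s(mega_transfers_per_sec, bus_bytes=8):
    # Both SDR SDRAM and DDR3 use a 64-bit (8-byte) data bus.
    return mega_transfers_per_sec * bus_bytes

pc133 = peak_bandwidth_mb_s(133)    # PC-133 SDR: 133 MT/s
ddr3 = peak_bandwidth_mb_s(1333)    # DDR3-1333, typical 2012 desktop

print(f"PC-133:    {pc133} MB/s")    # 1064 MB/s
print(f"DDR3-1333: {ddr3} MB/s")     # 10664 MB/s
print(f"Ratio:     {ddr3 / pc133:.0f}x")
```

Roughly an order of magnitude between what a 2012 desktop feeds its CPU and what these SoCs are working with.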
Probably why my hunt for a tablet ended with the ICOO D70W -- because at least it comes with a gig of RAM, 8 gigs of flash, a decent resolution (1024x600), and a 1.3 MP camera instead of the 0.3 MP most of your sub-$200 tablets come with. Ice Cream Sandwich out-of-box was just icing on the cake... ($134.99 at DX -- of course, being DX, I probably won't see it in the mail until June)
It's one of the outright insulting things about the newest iPad -- on the original, the 512 megs of RAM worked out well enough since the slower processor and GPU of that first-gen Samsung hummi... uhm, I mean Apple A4... really I do... in any case, the older, slower processor and anemic GPU meant RAM was the least of its issues. The iPad 2 fixed a lot of those issues -- in particular it's got five or six times the GPU -- and started to actually tax the RAM, and the newer iOS builds grew accordingly... so did they fix that on the iPad 3? Of course not; it's still coming with the same crappy 512 megs they had on the original model -- great when they've made the jump to dual core, meaning you can run twice as much at once... assuming it all fits in memory (which it won't).
Since these are all 32 bit and the various OSes are designed for that, there is little reason to need much more than 2 gigs... but with many droid devices shipping with 256 megs and Apple being stuck at 512 megs, that can be pointed at as one of the major performance bottlenecks; again, more RAM might consume more power at idle, but it would mean more room to work with and more efficient code execution -- and therein lower power use -- when you are actually trying to do something with it.
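The 2-gig figure falls straight out of 32-bit addressing. (The kernel/user split varies by OS; a 2 GB user space is a common default, not a hard rule -- the split below is an assumption for illustration.)

```python
# A 32-bit pointer can address 2**32 bytes total. The OS reserves
# part of that range for the kernel, so a single process commonly
# gets ~2 GB of usable address space (exact split is OS-dependent).

total = 2 ** 32
print(total // (1024 ** 3), "GiB addressable")  # 4 GiB addressable

kernel_reserved = 2 * 1024 ** 3  # assumed 2 GB / 2 GB split
user_space = total - kernel_reserved
print(user_space // (1024 ** 3), "GiB per process")  # 2 GiB per process
```

So past about 2 gigs, a single 32-bit process can't even address the extra RAM directly -- which is why the complaint is about being under that line, not over it.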
So I'm with you on that oddz -- the hardware is equally to blame.
May 7, 2012, 23:10 #28
I stopped reading at "BBC"... They remind me of another network in the States.
May 8, 2012, 00:21 #29
Originally Posted by paperchaser
Why did you bother posting in the discussion about the message then? "The messenger sucks, I didn't read the article" counts as fluff in my book.