Not sure what Intel has to do with display technology; they don’t make LCDs, they don’t make the OS that would drive them, and their best HD video chipset isn’t even as capable as a 15-year-old nVidia TNT or ATI Rage FX…
Notice how normal desktop users with 24" or larger displays are oddly enough missing? It’s part of the continuing trend of telling normal desktop owners to go *** themselves; the all-in-ones are particularly insulting in that department.
As I said recently, Microsoft has been geared up to support this sort of thing since Windows 3.0, and has ALL the needed hooks in place to seamlessly support increasing display resolutions, particularly on Win7, where they really refined how it works when you crank the enlargement up to 50% or more. Old applications/designs will simply get their rasters auto-resized larger and their fonts sharpened; newer applications, like those built on Metro, will simply render natively at the higher resolution.
Seriously, take Win7 on a 1920x1200 display, increase the zoom factor to 200%, then open up an ‘old’ application that doesn’t even know that increase exists.
Right now the resolutions we have are overkill for most raster images; the only reason to go sharper is for clearer text, so why not have the OS just pixel-double rasters and use the full resolution only for text? It’s what the various Apple products do, and it’s functionality that’s been built in since Windows Vista and works great under Win7.
Not that anyone is using the latter – because as I’ve said, where are the display vendors in all this?
Analytics has data on what resolutions your site’s visitors are using, right? I think the web design would still depend on that factor; the only downside would be the fonts. Okay, on to the retina resolutions. If I were going to use those resolutions, it would be for watching movies; I wouldn’t use them on a normal day. I think it’s going to be hard having to manually zoom in, i.e. move closer to the monitor, just to see the letters and the other important stuff that might have become “invisible” due to the high resolution.
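For what it’s worth, that resolution data is really just the browser’s own screen properties being phoned home. Here’s a minimal sketch of the kind of data point a tracker collects; `screen.width/height`, `window.innerWidth` and `devicePixelRatio` are real browser properties, while the reporting function and endpoint URL are purely hypothetical:

```typescript
// Sketch of the screen-resolution data point an analytics tool might collect.
// The browser properties used below are standard; the endpoint is a placeholder.
function reportResolution(): void {
  const payload = {
    screenWidth: screen.width,           // display width as the browser reports it
    screenHeight: screen.height,
    viewportWidth: window.innerWidth,    // what the page layout actually has to fit
    pixelRatio: window.devicePixelRatio, // flags "retina"-class displays
  };
  // sendBeacon is a real browser API; the URL here is purely hypothetical.
  navigator.sendBeacon("/analytics/resolution", JSON.stringify(payload));
}

reportResolution();
```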
That’s why those types of displays would have to be used with an OS that transparently enlarges everything 150 to 200% by default, preferably the latter. All programs would still think they were at a ‘normal’ resolution, but text and vectors would be smoother/cleaner. Just like the Retina-display iPhone 4S does: the browser reports to the website that it’s running at half the actual resolution, and simply uses the extra pixels to enhance downscaled images and render smoother fonts/vectors.
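From the web side, that split between reported and actual resolution is exposed through `window.devicePixelRatio`, which is a standard browser property (2 on the Retina iPhones). A minimal sketch of how a page takes advantage of it; the “@2x” file naming is just a hypothetical convention for this example:

```typescript
// Sketch of the CSS-pixel vs. device-pixel split described above.
function pickImageSource(baseName: string): string {
  // The page is laid out in CSS pixels; the browser exposes the
  // physical-to-CSS pixel ratio here (2 on a Retina display).
  const dpr = window.devicePixelRatio || 1;
  // Serve a pixel-doubled raster only when the display can actually use it.
  return dpr >= 2 ? `${baseName}@2x.png` : `${baseName}.png`;
}

const img = document.createElement("img");
img.src = pickImageSource("photo");
img.style.width = "200px"; // CSS size is identical either way, so the layout never changes
document.body.appendChild(img);
```

Text and vectors need no such trick, since the browser already renders them at the full device resolution.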
It’s why the suggested resolution for a desktop display is 3840x2160, exactly double today’s standard high-quality 1920x1080 widescreen resolution in each dimension… Though at four times the pixel count, that’s going to put a lot of strain on the GPU and require more video RAM than a lot of cards make available.
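To put rough numbers on the “four times the pixel count” bit, here’s a back-of-the-envelope sketch assuming 32-bit colour (4 bytes per pixel) and ignoring double-buffering, depth buffers and textures, which only make it worse:

```typescript
// Back-of-the-envelope framebuffer math for "double 1080p".
const HD  = { w: 1920, h: 1080 };
const UHD = { w: 3840, h: 2160 }; // 2x in each dimension

const pixels   = (r: { w: number; h: number }) => r.w * r.h;
const frameMiB = (r: { w: number; h: number }) => (pixels(r) * 4) / (1024 * 1024);

console.log(pixels(UHD) / pixels(HD)); // 4    -- four times the pixels
console.log(frameMiB(HD).toFixed(1));  // 7.9  -- MiB per 1080p frame
console.log(frameMiB(UHD).toFixed(1)); // 31.6 -- MiB per 2160p frame
```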
To be honest, I was a little surprised the crappy PowerVR 540 video Apple licensed for the A4 chip is up to the job. (Or should I call the A4 by its real name, the Samsung Exynos 3110?)
I hate all-in-ones. Geez. Why do they fail to mention normal desktop users? What about standard people on 21.5", 23" (me), and up? What about multi-monitor users? Seems sort of incomplete to me.
They don’t mention the Mac Pro (desktop) line because they’ve become economically irrelevant to Apple, unfortunately.
I use a high-end Mac Pro. It’s a powerhouse, no doubt, and I still dislike using a laptop for my main work, so I’ll continue using desktops, be it Apple or otherwise.
All-in-ones aren’t bad for casual users who have no upgrading needs and don’t need lots of horsepower; they’re not meant for professionals anyway. The only products Apple keeps up to date for professionals now are the high-end MacBook Pros.
They never really were; since Jobbo the clown came back, the desktop/tower form factor has been an annoyance they still offered at noodle-doodle prices just to keep the die-hard old Apple fans on board. NOT that I suspect TeyYoyo was referring to the Fisher-Price My First Computers. While the article may have been on MacRumors, the focus was on Intel… which is why it didn’t make any sense, given Intel doesn’t make displays, and even Apple doesn’t use their video anymore unless it’s the crappy little mini… and I think even there they stopped doing it; at which point the question is either what Apple has to do with it, or what Intel has to do with it. Though I do see they’ve switched some of the ‘pro’ models to HD 3000 only… because when I think “pro”, I think Intel HD integrated graphics.
I’m sorry…
Probably less so than you think for the money… given their $2500 entry model isn’t as capable as a sub-thousand-dollar PC build, much less that you can build more computer than their top-of-the-line model for almost half the price. Unless, of course, you actually buy the build-quality Kool-Aid (which, as a former Apple tech, always left me asking WHAT BUILD QUALITY?!?).
Or until something breaks or burns out.
Which is the real laugh of it given how uselessly pathetic they are spec-wise.
No need to feel sorry for me. My machine is pretty darn good, sturdy, pimped with a REAL graphics card (not an official one, but a custom modified/hacked card), and going strong.
Who said anything about entry-level models? I said “high-end” Mac Pro model (around 4k), not the lower end, and the top models are powerhouses, which can hardly be argued against. The cost is ridiculous, yes, and I’m not sure I’m going to spend over 4k on a machine again.
I’ve had mine for over four years now and it’s still a beast, so I still have a few years without needing an upgrade.
Interesting that you focused on that when I was using the entire range for comparison, hence the sentence AFTER that… Must have been fun paying 4 large for a $2000 system.
Off Topic:
Gah, am I not being clear enough in my posts? I feel like doing Chris Tucker impersonations…
I focused on that because the entry-level models are completely irrelevant and have nothing to do with what I said. I don’t disagree with the rest of what you said, which is why there’s nothing to argue against.