Thursday, May 1, 2014

4k, okay?

All but the youngest among us have watched the steady march of ever-higher computer display resolutions, from pre-VGA, through the [something]GAs, and on to the newest: 4k.

In the interest of full disclosure, I am a grammar-ninny (transitive vs. intransitive verbs), a capitalization ninny (Mb vs. MB), and an acronym ninny, so let me start there.

4k monitors are not 4k.  "4k" is a term technically linked to the cinematic resolution of 4096x2160.  "4k monitors," however, are quad full-HD (officially dubbed Ultra HD, or UHD): 1920x1080 doubled in each direction, which works out to 3840x2160, or four times the pixels.

BEGIN: excited rambling

Why am I interested in UHD?

I have been using computers for a long time, and I use them as much or more now than ever.  My career is full of them, my hobbies mostly revolve around them, and now I pay most of my bills, plan my vacations and communicate with friends and family on them.

One recurring theme has persisted throughout my computer years: I need more screen space.  From wanting a bigger TV for my Commodore 64 (yes, THAT long ago!), to bigger monitors, to more monitors... I just can't get enough screen real-estate to sate my need for it.

When I bought my last--and current--computer, I purchased three ASUS WUXGA (1920x1200) monitors to go with it.  Later, I added two more similar monitors for a five-screen configuration at my home office.  This, for me, was screen-space nirvana, but in the last few years it has become "not enough" all over again (not to mention the neck strain of looking 'up' at two of the monitors).

Enter the UHD (tragically a.k.a. 4k) monitors.  I salivated when I saw the first of these in the late 2000s at a research facility where I worked.  I watched as they became a consumer reality, and I seriously weighed the cost/benefit of spending $10,000 on one when that was the going rate.  I followed them down into the $3k range, all the while knowing that ever-increasing pixel density was running an opposite course to my eyes' ability to perceive it.

And then it happened, just last week.  A well known, reliable OEM released a UHD display with a price point that even I could not resist.

Oh my god, what have I done?

Somehow, through a magical mix of fate, luck, and regular Googling, I found Samsung's U28D590D, a 28-inch UHD monitor, for sale in the $600-$700 range.  I'd seen Dell's similar offering, priced at $800, but its 30Hz refresh rate was a show-stopper for me.  The Samsung part has a full 60Hz refresh rate.

Before I knew it, I had read every review available for these monitors, watched videos of them, and ordered three, complete with express shipping.

But that's not all.  I am a Windows user, and my 2010 "best of the best" PC used an AMD HD 5870 GPU, which does not support UHD resolution, so I also ordered two R9 280X GPUs to power the new displays.

I love pixels, but I digress

I'm a programmer.  I've done a little of just about everything, from drivers, to patented network stacks, to business software, to image processing, remote access, and physics.  I enjoy it all, but what I really, deeply love is graphics: high quality, interactive, math-intensive graphics, such as work with OpenGL.

In this regard I would consider myself a creative person.  I'm not an artist, insofar as I can't create imagery from vision, but I love to make pixels do fancy things, and I love to find new ways to make computer visuals interesting.

On that note, I would say that anybody who is similarly positioned is aware of two parallel and wonderful trends in technology:

1. The science of how to display pixels

Let's face it, everything is discrete, and vision is no exception.  Since the inception of the first discrete (pixel-based) display, there has been a constant and rapid trend towards better pixels.  Today it's UHD, along with other technologies like HDR (High Dynamic Range: displays that can show more contrast than your eyes can perceive).  In the very near future, your monitor will be able to fool your eyes into thinking what they see is reality.

2. The science of how to color pixels

Color photos were a major breakthrough for computers in the early '90s.  Since then the issue has been more about how to generate pseudo-3D images in a believable manner.  The state of the art has consisted of two different approaches for decades: polygon meshes for live, "interactive" graphics, and ray tracing (physics-based simulation of real light) for pre-rendered content.

But all of that is about to change.  Every couple of months someone releases the latest contender for a real-time ray tracer that runs on typical consumer hardware, and though they have all fallen short of "good enough," the gap is closing, and we will soon see a giant leap in rendering quality.
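
For the curious, here is a minimal sketch of the idea at the heart of a ray tracer: fire one ray per pixel and test it against the scene (a single sphere here).  Everything a real renderer adds, like materials, shadows, reflections and acceleration structures, is layered on top of this loop.  This is my own toy illustration, not code from any particular engine.

    #include <cstdio>

    struct Vec3 {
        double x, y, z;
        Vec3 operator-(const Vec3& o) const { return {x - o.x, y - o.y, z - o.z}; }
        double dot(const Vec3& o) const { return x * o.x + y * o.y + z * o.z; }
    };

    // Does the ray (origin o, direction d) hit the sphere?  Solve the quadratic
    // |o + t*d - center|^2 = r^2 and check whether it has real roots.
    bool hitSphere(const Vec3& center, double radius, const Vec3& o, const Vec3& d) {
        Vec3 oc = o - center;
        double a = d.dot(d);
        double b = 2.0 * oc.dot(d);
        double c = oc.dot(oc) - radius * radius;
        return b * b - 4.0 * a * c >= 0.0;
    }

    int main() {
        const int w = 64, h = 32;            // a tiny "image," printed as ASCII
        Vec3 sphere{0.0, 0.0, -3.0};
        for (int y = 0; y < h; ++y) {
            for (int x = 0; x < w; ++x) {
                // One ray per pixel through a simple pinhole camera at the origin.
                Vec3 dir{(x - w / 2) / double(h), (h / 2 - y) / double(h), -1.0};
                std::putchar(hitSphere(sphere, 1.0, {0.0, 0.0, 0.0}, dir) ? '#' : '.');
            }
            std::putchar('\n');
        }
        return 0;
    }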

So what?

One of the first things I did when I got my new monitors was watch video content specifically generated for UHD.  Some of it was spectacular.  Unfortunately, most of it was terribly done, shot with such a tight focus that you spend the entire video looking for the "high-resolution spot" and being put off by the blurry majority of the scene.

But (yes, I started a sentence with 'but'), having seen content at this resolution, and being a creative person, I am positively inspired to look for new ways to capitalize on the higher resolution to create new and more immersive experiences.  I have been writing a video game in my spare time, and now that I have experienced UHD, I think I will have to remake my textures and other assets so players with UHD hardware will get the experience they deserve.

END: excited rambling.
BEGIN: My UHD experience

I got 'em, they're cool. The end.  I wish it were so simple.

Windows vs. Mac

I'm a Windows user.  From what I am seeing online, most current Macs will not support UHD monitors well, so if you use that OS, check carefully before dropping money on a UHD display*.  Of course, when Apple releases 4k for the iMacs and MacBooks, they will be exquisite and work perfectly, because that's what Apple does.  I think right now, only the new Mac Pros support UHD.

* EDIT: Having discussed Mac 4k support, it seems many of the late-2013 Macs do in fact support UHD displays.

Windows vs. UHD

It seems simple enough: buy new monitor, plug in new monitor, enjoy awesomeness.

In Windows 7, Microsoft included a feature that allows you to scale text and other items up.  I have used this feature while setting up computers for my parents' generation, and always thought of it as a feature just for them, or for rarer use cases such as POS systems or people with severe sight issues.

No longer.

Samsung's UHD display is just 28 inches.  At that size, it has a pixel density of 157 DPI, and unless you use this scaling feature, things are very, very small.  Consider that the physical displays are only about 10% larger than my previous ones, yet they have twice the pixels in each direction.
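
If you're wondering where the 157 DPI figure comes from, it's just the diagonal resolution in pixels divided by the diagonal size in inches.  A quick sanity check (my own throwaway snippet):

    #include <cmath>
    #include <cstdio>

    // Pixels per inch = diagonal resolution in pixels / diagonal size in inches.
    double ppi(int width, int height, double diagonalInches) {
        return std::sqrt(double(width) * width + double(height) * height) / diagonalInches;
    }

    int main() {
        // 28" UHD panel: sqrt(3840^2 + 2160^2) / 28 is roughly 157.
        std::printf("28\" UHD (3840x2160): %.0f DPI\n", ppi(3840, 2160, 28.0));
        // Plug in your own panel's resolution and size to compare.
        return 0;
    }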

Scaling is not nirvana

Unfortunately, while Windows can scale almost everything it owns, such as title bars and standard Windows controls, it cannot force applications to use the system scaling feature properly.  For example, I use three browsers on my PC:

Chrome: tabs, UI elements and content all scale properly, but text is fuzzy.
Firefox: tabs, UI elements and content all scale properly, and it is crystal clear.
Opera: Nothing scales; it is all tiny.

Some apps are just bad.  Core Temp (used for monitoring CPU temperatures) puts numbers in the system tray; these are just eight pixels tall, and they looked quite good at ~94 DPI.  At 157 DPI, they are barely a twentieth of an inch tall--that is too small.
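
For the programmers reading this, the gist of why applications behave so differently: each one has to opt in to DPI awareness and then scale its own UI, otherwise Windows either bitmap-stretches it (fuzzy) or leaves it at its 96-DPI design size (tiny).  A rough sketch of the opt-in, as I understand it; the numbers are only examples:

    #define WIN32_LEAN_AND_MEAN
    #include <windows.h>
    #include <cstdio>

    int main() {
        // Tell Windows this process handles DPI itself, so it won't be
        // bitmap-stretched.  (Real applications usually declare this in
        // their manifest rather than calling the API.)
        SetProcessDPIAware();

        // Ask what DPI the system is actually running at.  On Windows 7/8 this
        // is the system-wide setting, e.g. 144 when scaling is set to 150%.
        HDC screen = GetDC(nullptr);
        int dpi = GetDeviceCaps(screen, LOGPIXELSX);
        ReleaseDC(nullptr, screen);

        // Scale a size that was designed against the classic 96-DPI baseline.
        int designHeight = 23;                       // "looked fine at 96 DPI"
        int scaledHeight = MulDiv(designHeight, dpi, 96);

        std::printf("System DPI: %d, scaled control height: %d px\n", dpi, scaledHeight);
        return 0;
    }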

Another interesting use case

I use Windows Remote Desktop to access several remote servers, and I access them from several locations (home and external office).  Resolution was never an issue, as all of the PCs I used had 1920-ish displays.

Now, however, when I access them using a UHD monitor, everything is very small, as the remote PC's OS is not set to scale.  If I adjust it to auto-scale, then when I access them from the 1920 displays, everything will be huge.  

This is a minor inconvenience, but worthy of mention, as a glimpse at the growth pains inherent in such a technology change.

Bulging eyes, at first

The first hours with UHD were exciting, but I found my eyes strained from the tiny screen elements.  However, over the next few days, as I played with color, contrast and brightness settings and made peace with the auto-scaling, my eyes relaxed and I am now as comfortable as I once was.

Also, whereas I originally thought three UHD displays would give me about as much real estate as 12 HD displays, I now realize it is something much less, perhaps in the 6-9 range, as some space is lost to up-scaling.  Of course this is a matter of preference; if I had superhuman vision, I could leave everything at a mere 100%, and I would indeed have the equivalent of 12 HD displays.
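
The back-of-the-envelope math behind that guess: scaling the UI by a factor s shrinks the usable area by s squared.  The scaling factors below are just common Windows settings used for illustration, not necessarily what I run on each monitor:

    #include <cstdio>

    int main() {
        const double uhdPixels = 3.0 * 3840.0 * 2160.0;  // three UHD monitors
        const double hdPixels  = 1920.0 * 1080.0;        // one full-HD monitor

        // Scaling the UI by s makes every element s*s times larger in area,
        // so the effective workspace shrinks by the same factor.
        const double scales[] = {1.00, 1.25, 1.50};
        for (double s : scales) {
            std::printf("%3.0f%% scaling: ~%.1f HD screens' worth of space\n",
                        s * 100.0, uhdPixels / (hdPixels * s * s));
        }
        return 0;
    }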

Final thoughts

If I had the desk space, and if price didn't matter, a 30-32" display would give me a more satisfying experience, but this is because my use is mostly productivity: developer tools, browser windows, Remote Desktop sessions, virtual machines, chat.

However, for gaming or video, the smaller size is great because the pixel density is so high.  I can still perceive the limit of the resolution in videos, leaving me to believe we have at least one more doubling of desktop resolution coming down the road.

With or without future generations of higher-resolution displays, the days of designing screen elements in pixels are gone.  Windows has not caught up to newer, smaller OSes such as Android, where it is possible to specify sizes in screen-relative and often user-controllable units.  Once Windows catches up on that, higher resolution won't necessarily mean any size difference at all, only a change in smoothness.  Note: it is possible to use relative sizes in Windows, but the OS provides essentially no support for pre-existing apps to auto-scale without developer intervention.
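
To illustrate what that developer intervention looks like today, here is one common pattern (my own example, not a prescription): size a font in points, which are 1/72 of an inch, and convert to pixels using the DPI the system reports, so a higher-density screen simply gets more pixels for the same physical size.

    #define WIN32_LEAN_AND_MEAN
    #include <windows.h>

    // Create a UI font whose size is specified in points (1/72 inch) instead of
    // pixels, so it comes out the same physical size regardless of pixel density.
    HFONT CreatePointSizedFont(int pointSize) {
        HDC screen = GetDC(nullptr);
        int dpi = GetDeviceCaps(screen, LOGPIXELSY);
        ReleaseDC(nullptr, screen);

        // Negative height asks GDI for this many pixels of character height.
        int pixelHeight = -MulDiv(pointSize, dpi, 72);
        return CreateFontW(pixelHeight, 0, 0, 0, FW_NORMAL, FALSE, FALSE, FALSE,
                           DEFAULT_CHARSET, OUT_DEFAULT_PRECIS, CLIP_DEFAULT_PRECIS,
                           CLEARTYPE_QUALITY, DEFAULT_PITCH | FF_DONTCARE,
                           L"Segoe UI");
    }

    int main() {
        HFONT font = CreatePointSizedFont(10);   // ~10pt regardless of DPI
        // ... a real app would SelectObject() this into a DC when painting ...
        DeleteObject(font);
        return 0;
    }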

I am happy with my new monitors.  I look forward to OS updates that help me enjoy them.

Oh yeah, what about these specific monitors?

Priced at between $600 and $700, the Samsung U28D590D is by far the most affordable UHD display, and what's more, it is ideal for gamers with its full 60Hz refresh rate (when driven via DisplayPort).

However, it is a TN panel, which means it is older display technology and does not have the super consistent color behavior that newer IPS and other technologies have.

For me, this does not matter, but to be honest I do see the difference.  As I write, this browser window is positioned in the center of the front monitor; to the lower right is another browser window, and right next to that, on another monitor, is Skype.  They all have white backgrounds, but the window at lower right has an ever-so-subtle warm tint to it, because my viewing angle is different towards that portion of the screen.  If I had just one monitor I would not have noticed, but right next to that browser is the Skype window on the other monitor, which, at its angle to me, looks whiter.  I can change the slight warmth difference by moving my head back and forth.

This review speaks about this, and how color-minded professionals might still find a way to enjoy the UHD experience:  http://www.youtube.com/watch?v=X5YXWqhL9ik

Brightness and consistency

These monitors are bright. It took me quite some time to fine-tune the controls to get a comfortable balance.  I also found quite a difference in color saturation from top to bottom (as mentioned in the review linked above), but I had no problem making adjustments to largely eliminate it.

However, as for consistency, I am impressed. I bought three of these at once, and have them placed side by side.  I have done solid color tests and more, have found zero dead pixels, and find all three to be identical, giving the exact same appearance when using the same settings.

One final note

After looking at many videos and comparing HD with UHD, either by watching two versions of one video side by side or by down-scaling one monitor, I have become "well tuned" to recognizing UHD content.  YouTube videos available in 4k, for example, begin in a lower resolution and then suddenly switch during playback.

Yesterday, I walked a few blocks to buy lunch.  It was a beautiful, sunny day, and on my way back I happened to look up, past a wonderful old building and through some trees.  Suddenly, my mind thought "UHD," and I swear I could tell which detail I would be able to see of that scene on a UHD monitor, and also what detail was too far or small even for this resolution.

It may be that I need a few days away from my new monitors :-)