Category Archives: Various Rants

A Benq sw2700pt monitor

On the importance of measuring homogeneity when testing a monitor

This post is about screen quality and how online reviews almost always fail to give insightful information about an important metric: panel homogeneity.

If you are planning to buy a new monitor for your computer, you will no doubt read tons of reviews, hoping they will help you make an informed choice. Depending on your use case, some metrics will matter more than others. Gamers will favor high refresh rates, while graphic artists will pay attention to color rendition and somewhat mysterious things such as color spaces.

Color accuracy and white point

Nowadays, every specialized tech website will test and report the color accuracy of the reviewed monitor. That is a good thing, as it was not always the case. The principle is to measure the deviation between the colors the screen was asked to display and the colors it actually displayed. This is done using an independent and accurate hardware device: a colorimeter.
Here is the kind of report produced.

A screen accuracy report

These values are quite good!

It is usually accepted that, for a given color, a DeltaE below 2 is invisible to the naked eye.
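To make the metric a bit more concrete, here is a quick back-of-the-envelope DeltaE computation using the simple CIE76 formula (colorimeter software usually reports the more refined DeltaE 2000, and the Lab values below are made up for the sake of illustration):

[cci_bash]
# CIE76 DeltaE between a requested color and the measured one, in Lab space:
# DeltaE = sqrt(dL^2 + da^2 + db^2). All values here are purely illustrative.
L1=50.0; a1=2.0; b1=-3.0   # color the screen was asked to display
L2=50.8; a2=2.5; b2=-2.1   # color actually measured by the colorimeter
echo "sqrt(($L2-$L1)^2 + ($a2-$a1)^2 + ($b2-$b1)^2)" | bc -l
# prints roughly 1.3, i.e. below the DeltaE < 2 visibility threshold
[/cci_bash]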

Another important measurement is the white point, expressed in Kelvin. It is the color temperature of the illuminating source. How does that translate into real life? On a bright sunny day at noon, the “white light” received on the ground lies around 5400K. On an overcast day, it is closer to 6000K or even 7000K: the light is bluer. An old incandescent light bulb emits at around 2800K, which is a very warm (i.e. yellowish) light.

Thus, when a photographer sets the color temperature to 6500K, he wants the rendering of his photos to approximate what they would look like if printed and viewed under an overcast sky.

The colorimeter can also measure the screen's deviation from the intended color temperature. It tells me that mine is about 45K away from the target (less than 1%).

Given these two reports, this screen must be close to perfection!

IT IS NOT!!!

Where are the homogeneity results?

The problem is that the colorimeter was placed at the center of the screen. Thus, all you can know by reading those reports is that this monitor features very good color accuracy in the center!

What's the situation in the corners? If the deviation is too high, you may see a color shift when filling the screen with white pixels! Fortunately, my colorimeter software also lets me measure the color temperature at nine different spots. The results follow:

White point report

The white point is perfect dead center. Not so much on the sides and in the corners.

In this report, there is a 200K difference between the left side and the right side. This is enough to be perceived with the naked eye!

And that does not even account for the green/magenta tint deviation, which I could not measure but can easily notice on the unit I tested.

Why is this information missing in most online reviews?

Why do review sites spend time measuring color accuracy but never uniformity? I don't know. It may be because moving the colorimeter around is a manual operation. But without a measurement of homogeneity, the information provided is incomplete.

Don't get me wrong: incomplete does not mean useless, as it still gives a clue about the general quality of a screen. But if color accuracy matters to you, next time you are looking for a new screen, google a bit about its uniformity before purchasing. Some screens get very good reviews across the board but are a disappointment to many photographers.

Here is a catastrophic BenQ SW2700PT sample. The colors are as they appear when displaying a white page!

Magenta Green

A very bad unit, but hey, it is accurate in the center!

The BenQ SW2700PT has a real homogeneity problem on most units. But if you only read one or two reviews and don't crawl personal blogs and specialized forums, you may never know until you buy one and turn it on!

Shadow and the future of Cloud Computing

Shadow

Introduction: The end of Moore’s Law

Moore's Law will soon hit a wall. Or perhaps not.

But it does not really matter, as we entered a zone of diminishing returns more than a decade ago. Twenty years ago, things were simple: thanks to a continuous progression in frequency and efficiency, computer processing power doubled every two years. In 1983, my personal computer was an Apple II with a mere 1MHz 8-bit CPU; in 1994, it was a 90MHz 32-bit Pentium; and in 2004, a 2GHz 64-bit Athlon.

But the frequency progression came to a halt: twelve years later, our desktop CPUs cannot easily sustain 4GHz. Efficiency gains also stalled: even a very wide CPU, such as the Intel Core, only processes about 1.5 instructions per cycle on average.

The industry transitioned to multi-core designs. Now even phone CPUs have at least two cores. But desktop CPUs seem to have hit a 4-core ceiling. The reason? Except in very specific cases, it is very hard to develop an efficient multi-threaded application.

So what now? Some predict the advent of “Elastic compute cloud cores”, which is a neat name for “Hardware as a Service” (HaaS). And that's what I will discuss in this article, through the prism of Blade, a young French startup that claims to have achieved some breakthroughs in the field of cloud gaming!

Cloud Gaming?

Cloud Gaming is a very peculiar subset of HaaS. Instead of running your video game on your home console or tablet, it is run on a distant server, in the “cloud”, and the resulting frames are streamed to you. Your device is thus only used for input and display.
Two of the most famous services are Geforce Now from NVIDIA and Playstation Now from Sony.

But cloud gaming is also one of the most difficult areas of cloud computing. Games require a tremendous amount of processing power and, more specifically, a powerful GPU. And last but not least comes latency: in order to be playable, a game requires a very low delay from the moment you input your commands to the moment the result is displayed on the screen. As the Internet was not designed with low latency in mind, this aspect is not easily tamed.

The two services mentioned above almost solve these problems. Almost…
PlayStation Now only runs older games, which require less computational power. And GeForce Now is capped at 1080p. Plus, according to reviews, it achieves latencies in the region of 150ms. Quite good: games are playable. But such a latency can make a game feel a bit jerky if you are as sensitive as some gamers claim to be.

Enter Blade

Blade is a young French startup claiming that, thanks to its brand new patented technologies, cloud gaming is now a solved problem! (Note, though, that those patents are still pending and not yet public.)

If true, their most impressive achievement is the latency induced by their solution: less than 16ms according to their website!

Concerning this latency thing

Let me define what “latency” refers to here. It is the time elapsed between your input (e.g. a button pressed on the controller) and its effect on the image displayed on the screen.

If your gaming system is a home console, such as an Xbox or a PlayStation, the latency is the sum of the time needed for your input to be interpreted by the software (negligible), the time spent rendering the resulting 3D image, and the time spent sending it to the TV (negligible).

But if you’re playing on a distant server, things are not so simple.
First, the data corresponding to your input travels to the server. It goes through your ISP, then through many routers and servers, before arriving at its destination. The Internet was not designed as a low-latency network…
Then, just as on a home console, the software computes the corresponding image.
But before being sent back, it has to be compressed, or the bandwidth required to send the stream of images would be unreasonable…
Then it travels back home, with more or less the same number of routers and servers lying along the way.
Then… the image has to be decompressed! Before finally being displayed on the screen.
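To give an idea of how tight a 16ms end-to-end target is, here is a rough latency budget. The individual figures are my own guesses, not numbers provided by Blade:

[cci_bash]
# Hypothetical end-to-end latency budget for one frame, in milliseconds.
up=2        # input travels from the player to the server (fiber, few hops)
render=7    # the game renders the frame on the server's GPU
encode=2    # a low-latency video encoder compresses the frame
down=2      # the compressed frame travels back to the player
decode=2    # the client decodes and displays the frame
echo "Total: $((up + render + encode + down + decode)) ms"   # -> 15 ms
[/cci_bash]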

 

Sources of latencies

Blade’s answers to latency

A few days ago, I was invited to Blade's offices, where I could ask many questions to try to understand how they tackle these problems. I will explain my understanding of their technology and what I extrapolate from it. Of course, Blade did not share all their secrets, so I have to fill in some blanks…

The network

Blade will require its first customers to have fiber. Cable subscribers will come second. But there will be no support for xDSL users. What about 4G? I don't know, but mobile access does not appear to be their priority right now. Fiber offers very low latencies compared to DSL, so it helps a lot.

But they are also restricting their very first user base to France. That allowed them to deal with the four major internet providers in the country and to directly link their network to those ISPs’. This limits the number of hops required for data to travel from most French subscribers to Blade’s servers.
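You can get a rough feel for the network part of the equation yourself with a simple round-trip measurement (the hostname below is a placeholder, not an actual Blade server):

[cci_bash]
# Measure the round-trip time to a nearby, well-peered host.
ping -c 5 some-nearby-server.example.com
# On fiber with good peering, a few milliseconds are common; on ADSL,
# interleaving alone can easily add 20ms or more to the round trip.
[/cci_bash]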

They also claim to have some patents pending concerning the network side of things, but I could not gather much information there. As it is far from my field of expertise, I will not try to guess what those patents could cover.

The video compression

I was eagerly waiting for answers about video compression, as plain H.264 encoding is not suited to low-latency streaming. Indeed, the main purpose of H.264, when it was designed, was to compress movies. In that case, it does not matter much if, in order to achieve the best quality/bitrate ratio, the encoder buffers many frames, adding hundreds of milliseconds of latency.

In that field too, Blade claims yet-undisclosed pending patents. They told me they were using a heavily tuned H.264 encoder. So heavily tuned that most hardware decoders are not flexible enough to handle the resulting video.
I can also imagine that they use an open-source low-latency audio codec: Opus, for instance, can go as low as 2.5ms per frame.
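Blade's encoder is proprietary, so I can only illustrate the general idea with stock tools. Here is the kind of low-latency configuration one can get from a vanilla ffmpeg build with x264 and Opus; the input file and the destination address are placeholders, and a recent ffmpeg is needed to carry Opus in MPEG-TS:

[cci_bash]
# Illustration only, not Blade's actual pipeline: "zerolatency" disables
# B-frames and lookahead buffering, and Opus is switched to its low-delay
# mode with 2.5ms frames.
ffmpeg -i input.mkv \
  -c:v libx264 -preset ultrafast -tune zerolatency \
  -c:a libopus -application lowdelay -frame_duration 2.5 \
  -f mpegts "udp://127.0.0.1:1234"
[/cci_bash]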

In order to decode their stream in the best conditions, they will also sell a small “box”: the Shadow. It is comparable in size to a Raspberry Pi and was designed by Blade using off-the-shelf components: no custom ASIC or FPGA here. The Shadow runs Linux and connects to the stream as soon as it has booted. This way, users never see the actual OS and are given the illusion of using a Windows machine: the distant computer.

A software client is also available on Linux, Android and Windows. But the 16ms latency will only be sustained on the Shadow. As a matter of fact, buying the Shadow will be mandatory for subscribers!

The server

The server's hardware is also crucial. If a game runs at 60fps, a single frame already takes about 16ms to compute (see the quick computation below), and Blade would miss its 16ms target for the whole chain. Thus, the hardware has to be fast enough to run your game much faster than 60fps, maybe at 120 or even 140fps!
As the most crucial element of a gaming computer is the GPU, Blade did not hold back: its servers are equipped with the latest GeForce GTX 1080. Better yet, each user gets the full benefit of a GPU: they are not shared between users as on competing solutions!
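The per-frame time budget is simply 1000ms divided by the frame rate:

[cci_bash]
# Frame time as a function of frame rate: at 60fps a single frame already
# eats almost the whole 16ms budget, hence the need to render much faster.
for fps in 60 120 140; do
  echo "$fps fps -> $(echo "scale=1; 1000/$fps" | bc) ms per frame"
done
[/cci_bash]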

The magic behind the curtain allows a virtual gaming machine to be instantiated on the fly, consisting of a virtualized CPU with its main memory, networked storage hosting the user's private data, and a physical GPU dedicated to that user.

Actually, as a customer, you rent such a virtual machine and the server will instantiate it when you need it. Your data are also kept on Blade’s servers and are, of course, persistent.
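Blade did not detail its virtualization stack, but a similar machine can be sketched with off-the-shelf tools such as QEMU/KVM and VFIO GPU passthrough. Everything below (the sizing, the network disk URL, the PCI address of the GPU) is an assumption, just to show the shape of such an instance:

[cci_bash]
# Sketch of a "personal cloud computer": virtual CPU cores and RAM, a
# network-backed system disk, and a physical GPU handed over to the guest.
qemu-system-x86_64 -enable-kvm -cpu host -smp 4 -m 8192 \
  -drive file=nbd://storage.internal:10809/user42,format=raw \
  -device vfio-pci,host=01:00.0 \
  -display none -vga none
[/cci_bash]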

The demos

I admit that I have limited knowledge in the fields of networking and virtualization. So, although Blade's developers gave me convincing answers about their claim of keeping latency below 16ms, I was waiting for the demo to form my opinion.

They currently have two demos in place. The first one involves two cheap netbooks: one running Blade's solution, the other left as a plain netbook. The second consists of playing Overwatch on the Shadow hardware.

On the netbooks

The goal of this demo is to use the two netbooks side by side.

The user should not notice any difference on light tasks, such as handling local files in Explorer or checking emails.
Even so, I managed to guess which computer was displaying the stream from the cloud: by scrolling like crazy, I could detect a few dropped frames. Admittedly, this is not a representative use case, as I actively tried to provoke those lags.

Next, they launched Photoshop. A Gaussian blur filter was applied to a photo of several hundred megapixels. Of course, the real netbook struggled, while the (quite demanding) task ran way faster on the distant computer.
On the plus side, the lag was low enough that I could not notice it when moving the mouse pointer and going through the menus. On the minus side, I could again notice some lag and some compression artifacts when I clicked undo/redo. This is a more realistic use case than my earlier attempt to provoke lag: on a powerful computer with plenty of RAM, undoing (or going back through the history) should be near-instantaneous. When working on my photos, going back and forth to evaluate the effect of the various filters I apply is part of my workflow.

Finally, I ran the latest Futuremark benchmark on both machines. Of course, the score of the “true netbook” was pitiful while the cloud computer's was stellar!
But, once again, I could spot the difference: in the corners, some compression artifacts (slightly blurred blocks) appeared on the streamed video.

Bear in mind that, although the cloud computer was not 100% indistinguishable from a true local computer, the demo was convincing enough, and remote heavy applications such as Photoshop are quite usable! As I know a thing or two about video compression, I knew exactly how to stress their video encoder and where to look for the results. It is doubtful that an average user would spot the difference.

On the Shadow

Remember that Blade developed custom hardware. As they are working on it full time, they also told me that the software client running on the netbook was not up to date. Thus, I was expecting an even better experience on the Shadow!

This time, the demo consisted of playing a game of Overwatch. FPS games are among the most demanding, so this is quite relevant. As before, I tried to provoke artifacts by jumping, turning around and teleporting like crazy. But this time, I could not spot any!
As a matter of fact, I could not feel any difference from playing on my own computer! The graphics settings were 1080p / Ultra.

This time I was 100% convinced! 🙂

So much so that I unplugged the Ethernet cable to verify that the game was indeed streamed 😀

The future of cloud gaming?

Although I cannot certify that the latencies I experienced were indeed lower than 16ms, Blade's devs gave me credible answers to my questions and I was convinced by the actual experience. On their hardware device, at least.

I’m quite sure that they’re technologically onto something.

I’m less sure about what they unveiled concerning their business model.

They plan to target hardcore gamers and sell them a monthly subscription to their service. The price is yet to be disclosed.
Blade will also require their customers to buy the hardware, including a Windows 10 license for the cloud computer instance. So expect something around €200.
And don't forget that all you get is access to a cloud computer: gamers will still have to buy their games. It is not a “Netflix-like” subscription offering a large library of streamable games.
I think that's a lot of money, even if the gaming experience is premium compared to the much cheaper competing solutions.

But investors seem to believe in them, so maybe I am wrong. Let's wait and see…

The future of cloud computing?

While I have my doubts about the viability of their announced business plan focused on cloud gaming, their technical solution opens a window onto something else. Remember the demo where I tried Photoshop? It was quite usable indeed!

Nowadays, customers favor light and practical computing devices. Unfortunately, those are too weak to handle heavy computing tasks. And they will remain too weak for years to come because, as you may remember, Moore's Law is dead!

Enter cloud computing. If, when needed, those customers could easily instantiate a cloud computer such as the one I ran Photoshop on, the problem would be solved!
Of course, the pricing could not be the one Blade presented to me. Maybe a price depending on usage? And there is the problem of the Windows license, which is tied to the instantiated personal cloud computer…

But the prospects are quite fascinating!

Using Lightroom on Desktop and Laptop: How to Sync catalogs and photos on two computers

The main goal of EZWebGallery, my main pet-project, is to offer an easy way to publish photo galleries online, meaning that I am interested in the process of producing photos 😉

My main tool for sorting and editing photos is Adobe Lightroom. I also use Adobe Photoshop from time to time when I need to perform more complex modifications. Things were easy for me as I only worked at home, on my desktop computer.

But it appeared that I would gain a lot by also having a way to work on the go, so I bought a MacBook Pro. And then the workflow problem arose: how can I edit my photos on both my desktop and my laptop?

Searching the web, I realized it is a very common question, but most of the proposed answers did not satisfy me.

Most involve using an external USB hard drive (slow) to store the photos and accessing them from both the desktop and the laptop, while keeping two “Catalogs” (Lightroom's database): one on the desktop's internal drive (fast) and the other on the laptop. This appears to introduce a lot of headaches that I don't want :/

Another, more interesting, proposal was to store both the Catalog and the photos on cloud storage, such as Dropbox. These files would always be kept in sync by the cloud service. Thus, I could work on the two machines without thinking about the details, and get rid of a slow and clunky USB drive on the road!
But there are two drawbacks.

  1. The cloud storage might prove expensive. For instance, Dropbox charges no less than $100 a year for 1TB of data!

  2. The files must transit through Dropbox's servers on the Internet to go from one machine to the other, even if both are on the same local network, only a few inches apart! On an ADSL line, the initial synchronization would take days (see the rough estimate below)!!!
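For instance, here is a rough estimate assuming 200GB of RAW files and a typical 1Mbit/s ADSL uplink (both figures are assumptions on my part):

[cci_bash]
# 200GB = 200 * 8 * 1000 Mbit; at 1Mbit/s, that is the same number of seconds.
echo "scale=1; 200*8*1000/3600/24" | bc   # -> about 18.5 days of upload
[/cci_bash]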

Cloud-Service

Using the services of a cloud storage provider is not a solution!

 

Enter Syncthing

 

Syncthing is a peer-to-peer synchronization application: it can be used to synchronize files without the use of a central server (aka a cloud service)!
Plus, it is open source, meaning it is free!
Plus, it is multi-platform, meaning that I can install it both on my Windows desktop computer and on my MacBook Pro.

So that is what I did, and it works exactly as intended!
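For the record, the setup is roughly as follows. The package names are assumptions for a Debian-based box and for Homebrew on the Mac; on Windows, Syncthing ships a regular installer, and the pairing is done through the web GUI in every case:

[cci_bash]
# Install Syncthing on each machine taking part in the synchronization.
sudo apt install syncthing   # Debian/Ubuntu (desktop, NAS or server)
brew install syncthing       # macOS

# Start it, then open http://127.0.0.1:8384 in a browser to exchange the
# device IDs between machines and share the folder that holds the
# Lightroom catalog and the photos.
syncthing
[/cci_bash]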

As soon as my two computers can reach each other, they begin to synchronize the catalog and the photos. The initial synchronization, for a few hundred photos, is a matter of minutes. The following ones, reflecting the edits performed on the photos, are a matter of seconds!

Syncthing Simple

Now I can work on one machine, wait around a minute, and continue my work on the other one. It’s seamless, and it’s great!

Syncthing is not limited to two machines, so you can also sync your photos to a backup server or to a third PC. It’s very flexible.

In fact, after a few days of experimentation, I devised an even better way of using Syncthing to fulfill my needs!

Usually, I import my photos and perform the first edits on my laptop while in the field. The (minor) drawback of the PC/laptop synchronization scheme presented above is that, once at home, I have to turn on both machines to let them get in sync. It would be convenient if I did not have to… And it's quite possible!

In addition to my two computers, I also own a NAS (a small storage server) at home and a server somewhere on the Internet (powering this very website!)

I also installed Syncthing on them and added them to the “network”. The fact that these two machines are always on and within reach of each other is very convenient!

Now, as soon as I have Internet access in the field, my laptop synchronizes itself with the server, which in turn sends the photos to the NAS at home.
Back home, all I have to do is power up my main computer and wait a few minutes before resuming my edits!

Syncthing

Hosting a private VoIP server on a Raspberry Pi

Have you ever heard about Mumble? It is an open-source, low-latency voice chat application, primarily aimed at gamers. I gave it a try lately and was really impressed by the audio quality.
So much so that I now use it way more often than “traditional” Voice over IP software such as Skype or Hangouts!

But there is a catch… You have to agree with your friends on a public server to join. There, anyone else who joins can take part in your conversation… It's an audio version of the old public chat rooms you may have used 15 years ago!

Fortunately, it is possible to host your own server and make it private. My always-on Raspberry Pi, which mainly serves as a BitTorrent Sync server, is the perfect device for the job!

First, download and install the mumble-server package.

[cci_bash]
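# install the Murmur daemon, then run Debian's configuration step
# (autostart, process priority, SuperUser password)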
apt-get install mumble-server
dpkg-reconfigure mumble-server
[/cci_bash]

It will of course work on Raspbian, but also on Moebius, the very light and headless derivative running on my Pi.

You can now edit the configuration file:

[cci_bash]
nano /etc/mumble-server.ini
[/cci_bash]

There you can set a password required to join the server, or change the UDP and TCP port it listens on. By default, it is 64738. Make sure it can be reached from the Internet 😉
By the way, if you want to generate a random password, give my password generator a try 🙂
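For illustration, here are the two settings I touched, and a quick way to check from another network that the TCP side of the port answers (the password and hostname are placeholders):

[cci_bash]
# Relevant lines in /etc/mumble-server.ini (the password is a placeholder):
#   serverpassword=SomeLongRandomPassword
#   port=64738
# From outside your network, check that the TCP port is reachable:
nc -vz your.public.hostname 64738
[/cci_bash]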

Once done, restart the daemon.

[cci_bash]
/etc/init.d/mumble-server restart
[/cci_bash]

You can now privately chat with your friends! Enjoy 😉