CodeBlocks

Thursday 22 May 2014

Steam In-Home Streaming

A few hours ago Valve released the once beta-only Steam In-Home Streaming to everyone. Steam In-Home Streaming allows one to remotely play a game on any machine that supports the Steam client. That's it, and it really is that simple.

How it works
Now you might think there is some magic at play here, but in reality the implementation is very simple. What In-Home Streaming does is create a client-host setup: the host is the machine that does the heavy lifting, and the client receives the end result of said heavy lifting.

In-Home Streaming launches the game on the host machine as if someone were physically going to play on it. It then captures the monitor view (not the raw output of the GFX), compresses it with H.264 and streams the video to the client. This means that if you alt-tab on the host machine, the desktop will still be streamed to the client. So really the client machine becomes a duplicate display of the host machine.
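Just to make the idea concrete, here's a toy sketch of that host-side loop in Python. It is purely illustrative, not Valve's implementation: capture_screen() and h264_encode() are hypothetical stand-ins for a real screen grabber and encoder, and the client address is made up.

    import socket
    import struct
    import time

    def capture_screen() -> bytes:
        # hypothetical stand-in for a real screen-capture call;
        # it grabs whatever the monitor shows, game or desktop
        return b"\x00" * (1920 * 1080 * 3)

    def h264_encode(raw: bytes) -> bytes:
        # hypothetical stand-in for a real H.264 encoder
        return raw[: len(raw) // 100]  # pretend ~100:1 compression

    def stream_to_client(addr: str = "192.168.0.10", port: int = 27036) -> None:
        with socket.create_connection((addr, port)) as sock:
            while True:
                frame = h264_encode(capture_screen())
                # length-prefix each frame so the client knows where it ends
                sock.sendall(struct.pack("!I", len(frame)) + frame)
                time.sleep(1 / 60)  # aim for 60 frames per second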

This does have some drawbacks.

How to setup
Setting it up is really simple:
  1. Install Steam on both the host and client devices, and make sure both are running the latest version.
  2. Run the Steam client on both machines.
  3. Log into the same account on both Steam clients.
  4. On the client machine, select the game and click Stream.
Note: the host machine HAS to be logged into a user account.

How well does it work
Now this will all depend on various factors, since there are 4 bottlenecks at play (I've put a rough back-of-the-envelope sketch after this list):
  1. Rendering time - The time it takes for the host machine to render a frame
  2. Encoding time - The time it takes to compress a frame to H.264
  3. Network - The time it takes to correctly send the frame data
  4. Decoding time - The time it takes to decode a frame
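The per-stage times below are made-up example numbers, not measurements. The point of the sketch is that the four stages overlap like a pipeline, so the frame rate is capped by the slowest stage, while the lag you feel is roughly the sum of all four:

    stage_ms = {
        "render": 8.0,   # host renders a frame (assumed)
        "encode": 4.0,   # compressing the frame to H.264 (assumed)
        "network": 3.0,  # sending the frame data (assumed)
        "decode": 2.0,   # client decodes the frame (assumed)
    }

    felt_latency_ms = sum(stage_ms.values())   # what you notice as lag
    slowest = max(stage_ms, key=stage_ms.get)
    fps_cap = 1000.0 / stage_ms[slowest]       # pipelined throughput

    print(f"~{felt_latency_ms:.0f} ms of lag, "
          f"capped at ~{fps_cap:.0f} fps by the {slowest} stage")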
Now I tested this on both a wired and a wireless machine, neither of which would in any way be capable of running the games on its own. The performance does depend on the game; some games stream better than others, and I'm sure CPU-intensive games will suffer more due to the H.264 encoding overhead.

The games I tested were NFS Hot Pursuit and HL2. The wireless connection did appear to drop frames from time to time, which in NFS was kinda annoying; it performed well, it could just... be better. The wired connection dropped fewer frames and overall performed really well and was very playable. You do notice a latency issue from time to time, but overall it worked really well and was very, very simple.

Conclusion 
The approach Valve went for is a very simple but limited implementation of game streaming, which for advanced users might be a bit of a sticking point, as you can't really create a server to host games to the level some would want.

You also can't change the codec settings, which for me at least is very lame; I'd love to be able to play around with them and see how they influence the gameplay.

In-Home Streaming does what it says and it works very easily with very few hiccups, which is great for quickly setting up a home entertainment system, but it lacks advanced options and the ability to scale beyond being a mere slave for the client; the client essentially renders the host machine unusable.

Saturday 18 January 2014

x264 CRF Guide

Along with x264 came a new rate control method, Constant Rate Factor (CRF); to me this is one of the best ways to encode videos. Unfortunately there isn't a lot of data on how CRF actually works and the results it creates. The reason for this is that it is based off of Constant Quantizer (QP), so to understand CRF one must know how QP works first.

Constant Quantizer(QP)
Quantization works by breaking large input sets into smaller ones by removing information that might not be as relevant. A small example of this: instead of storing a value like 28.56204234672, we store it as 28.56.

So QP works by keeping the amount of information it throws away consistent, thus resulting in a constant quality. In the H.263 (xvid, divx, MPEG-4 Part 2, 3ivx) days a QP of 1 = 100% quality and every step was 5% less, so QP 3 would be 90%, QP 5 = 80% and so on.

In H.264 a QP of 0 = 100% and there are 51 steps, suggesting that every step is roughly 2% quality loss. Of course that is all mathematical, and in the real world we deal with perceptual quality. And that is where CRF comes in.
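The arithmetic behind those percentages, as a quick sketch (the linear percentage-per-step mapping is the simplification used above, not a property of the codecs themselves):

    def h263_quality(qp: int) -> float:
        return 100 - (qp - 1) * 5     # QP 1 = 100%, QP 3 = 90%, ...

    def h264_quality(qp: int) -> float:
        return 100 - qp * (100 / 51)  # QP 0 = 100%, 51 steps down to 0%

    print(h263_quality(5))   # 80.0
    print(h264_quality(25))  # ~51.0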

Constant Rate Factor
CRF works similarly to QP in that it tries to mimic the perceptual quality output of QP, such that a QP of 21 would look near identical to the human eye as a CRF of 21. It does this by reducing information in areas the human eye can't perceive and redistributing those bits to other areas that might need more data.

[Screenshots: CRF 25 vs QP 25]
As you can see the images have a very similar quality, even during playback, but there is a significant difference in file size:
QP: 2.6 GB
CRF: 2.1 GB (about 19% smaller)
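Here is a sketch of how a comparison like that could be reproduced with ffmpeg's libx264 encoder (assuming ffmpeg is installed and on the PATH; source.mkv is a hypothetical input file):

    import os
    import subprocess

    SRC = "source.mkv"  # hypothetical input file

    for name, flag in (("qp25", "-qp"), ("crf25", "-crf")):
        out = name + ".mkv"
        # -an drops the audio so only the video size is compared
        subprocess.run(["ffmpeg", "-y", "-i", SRC, "-c:v", "libx264",
                        flag, "25", "-an", out], check=True)
        print(out, os.path.getsize(out) / 2**30, "GiB")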

Resolution scaling

So let's take a look at how CRF scales with different resolutions. The y-axis represents file size and the x-axis the resolution, from 96x52 up to 1920x1040. Each step scales both dimensions by another 5% of the maximum: the first step is 5% (96x52), the second 10% (192x104), the third 15%, and so on until 100% (1920x1040).
As you can see, there is a slight indication that the amount of data needed to represent an image with CRF scales slightly exponentially with the number of pixels.
And if we look at the encoding performance above, we can see that as the resolution increases there is a sharp drop in encoding speed that tapers off as the resolution reaches the higher end.
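For reference, a resolution ladder like that is easy to generate; a sketch, assuming the 5% steps apply to each dimension (which matches the 96x52 starting point):

    MAX_W, MAX_H = 1920, 1040

    def even(n: float) -> int:
        # x264 prefers even dimensions, so round to the nearest even number
        return max(2, round(n / 2) * 2)

    for pct in range(5, 101, 5):
        w, h = even(MAX_W * pct / 100), even(MAX_H * pct / 100)
        print(f"{pct:3d}% -> {w}x{h}")  # 5% -> 96x52 ... 100% -> 1920x1040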

So what is a good CRF value? It mainly depends on what you want: if you want a near-identical copy, then 18; for high quality, 19-21.

But let's see how various CRF values impact the compression of a file. The y-axis is file size and the x-axis is the CRF value.
Again an exponential trend occurs: the lower the CRF value, the significantly larger the file size. As for image quality, it's hard to see the exact difference when you compare still images of the various CRF values with each other; you only notice a significant difference when it's in motion. However, I would suggest going no higher than 25.
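The sweep itself can be scripted along the same lines as the earlier sketch (same hypothetical source.mkv and ffmpeg assumptions):

    import os
    import subprocess

    for crf in (18, 21, 25, 30):
        out = f"crf{crf}.mkv"
        subprocess.run(["ffmpeg", "-y", "-i", "source.mkv", "-c:v", "libx264",
                        "-crf", str(crf), "-an", out], check=True)
        print(f"CRF {crf}: {os.path.getsize(out) / 2**20:.1f} MiB")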

[Screenshots: CRF 21, CRF 25, CRF 30]

Sunday 5 January 2014

Hollywood still uses film

It might come as a surprise to know that the majority of major Hollywood films are still shot using analog film. Inception, The Dark Knight, Star Trek Into Darkness and Man of Steel are a small sample of movies still shot on film [link].

And at first glance you might think it's silly to still use film; after all, film production is significantly more expensive. The film needs to be developed (creating the negatives) and then turned into digital content via a film scanner before they can add in any CGI. Not to mention the cost of developing prints of the film just to get it displayed.

“It costs about $1,500 to print one copy of a movie on 35 mm film and ship it to theaters in its heavy metal canister. Multiply that by 4,000 copies — one for each movie on each screen in each multiplex around the country — and the numbers start to get ugly. By comparison, putting out a digital copy costs a mere $150.” [Link]

If film is more expensive and harder to work with, why do they still use it? One reason is that film is analog and is not really subject to resolution boundaries like digital. The reason why they can remaster Indiana Jones, Star Wars (4, 5, 6) and all those other old films to HD (1920x1080) is that all they have to do is rescan the negative at a higher resolution. Whereas with digital you're stuck with the resolution you shot in. So when 4K (3840x2160) comes out the old movies will be rescanned and sold again, and when 8K (7680x4320) comes out it's the same process.

Of course there is a limit to how much resolution you can get out of analog film. Theoretically all you have to do is: ((lpm * w * 2) * (lpm * h * 2)) / 1,000,000 = megapixel count [link]
Where:
lpm = lines per millimeter
w = width of film in millimeter
h = height of film in millimeter

So according to that, a 35mm (Academy ratio) frame at 160 lines per millimeter would give ((160 * 22 * 2) * (160 * 16 * 2)) / 1,000,000 ≈ 36 MP.
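The same calculation as a small sketch (the 22mm x 16mm frame size and 160 lpm figure come from the example above):

    def film_megapixels(lpm: float, w_mm: float, h_mm: float) -> float:
        # the * 2 is the sampling factor from the formula above
        # (two samples are needed to resolve each line)
        return (lpm * w_mm * 2) * (lpm * h_mm * 2) / 1_000_000

    print(film_megapixels(160, 22, 16))  # ~36.0 MP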

This is of course assuming perfect conditions, which never occur in the real world, and others have done side-by-side comparisons between the two. It turns out that in practice 35mm film is about 9MP, and if you take Ken Rockwell's lie factor into account it's 18MP [link]. Digital film cameras are not yet at that level, although the gap is narrowing and more films are being shot in digital [link], such as The Hobbit, shot using the RED digital camera.