Context Adaptive Binary Arithmetic Coding

Friday, January 2nd, 2015

Context adaptive binary arithmetic coding, often abbreviated as CABAC, is an essential part of modern video compression. It performs lossless compression of a binary data stream without any prior knowledge of the particular characteristics of the data. These features make it an attractive algorithm for video compression because it is simple, effective, relatively fast, and requires no prior (or side band) communication between a client and server.
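To make the "adaptive" part concrete, here is a minimal, illustrative sketch of an adaptive binary probability model in Python. It is not the table-driven state machine that H.264/H.265 actually use; it only shows how a coder can learn the statistics of a bit stream on the fly, with no prior knowledge of the data.

```python
# Minimal sketch of an adaptive binary probability model: the coder keeps a
# running estimate of P(bit == 1) and updates it after each bit it codes.
# This illustrates the "adaptive" idea only; the real CABAC engine in
# H.264/H.265 uses a table-driven finite-state estimator, not this formula.

class AdaptiveBitModel:
    def __init__(self):
        self.ones = 1    # Laplace-style counts so the estimate never hits 0 or 1
        self.total = 2

    def probability_of_one(self):
        return self.ones / self.total

    def update(self, bit):
        # After coding each bit, nudge the estimate toward recent history.
        self.ones += bit
        self.total += 1

model = AdaptiveBitModel()
for bit in [1, 1, 0, 1, 1, 1, 0, 1]:
    p = model.probability_of_one()  # an arithmetic coder would use p to size its interval
    model.update(bit)

print(round(model.probability_of_one(), 2))  # 0.7 after seeing mostly 1s
```

In a full coder there is one such model per context (per syntax element and neighborhood), which is where the "context adaptive" name comes from.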

The most well-known codecs that leverage CABAC are H.264/AVC and H.265/HEVC. My own codec, P.264, also leverages a version of CABAC, and I recall that during the early days of P.264 development there were surprisingly few solid resources available for learning about the details of this system.

The goal of this article is to provide a gentle introduction to the process without delving too deeply into the underlying theory. Additional resources for further reading will be linked inline throughout the article. Also, for those unfamiliar with basic compression concepts, check out my previous article on compression fundamentals.

Continue reading...

Fundamentals of Compression

Friday, December 26th, 2014

Compression is the process of reducing the storage space required for a set of data by converting it into a more compact form than its original representation. The concept of data compression is fairly old, dating at least as far back as the mid-19th century and the invention of Morse Code.

Morse Code was created to enable operators to transmit textual messages across an electrical telegraph system using a sequence of audible pulses to convey characters of the alphabet. The inventors of the code recognized that some letters of the alphabet are used more frequently than others (e.g. E is much more common than X), and therefore decided to use shorter pulse sequences for more common characters and longer sequences for less common ones. This basic compression scheme provided a dramatic improvement to the system's overall efficiency because it enabled operators to transmit a greater number of messages in a much shorter span of time.
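As a rough back-of-the-envelope illustration of the idea, the snippet below compares a fixed-length scheme against frequency-weighted Morse pulse counts. The letter frequencies are approximate illustrative values, not an exact English frequency table.

```python
# Why variable-length codes help: weight each letter's pulse count by how
# often the letter actually occurs. The pulse counts are the number of dots
# and dashes Morse Code assigns to each letter; the frequencies are rough
# illustrative values for this small subset of the alphabet.

frequencies = {"E": 0.127, "T": 0.091, "A": 0.082, "X": 0.002, "Q": 0.001, "Z": 0.001}
morse_pulses = {"E": 1, "T": 1, "A": 2, "X": 4, "Q": 4, "Z": 4}

# Normalize the partial frequency table so it sums to 1 for this comparison.
total = sum(frequencies.values())
weights = {letter: f / total for letter, f in frequencies.items()}

fixed_cost = 4  # assume a fixed-length scheme spends ~4 pulses on every letter
variable_cost = sum(weights[letter] * morse_pulses[letter] for letter in weights)

print(f"fixed-length: {fixed_cost} pulses per letter")
print(f"frequency-weighted Morse: {variable_cost:.2f} pulses per letter")  # ~1.31
```

Because common letters dominate the average, the frequency-weighted scheme spends far fewer pulses per letter than a fixed-length one, which is exactly the efficiency gain described above.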

Although modern compression processes are significantly more complicated than Morse Code, they still rely on the same basic set of concepts, which we will review in this article. These concepts are essential to the efficient operation of our modern computerized world — everything from local and cloud storage to data streaming over the Internet relies heavily on compression and would likely be cost ineffective without it.

Continue reading...

Perceptual Hashing

Wednesday, May 28th, 2014

Hash functions are essential mathematical tools used to translate data of arbitrary size into a fixed-size output. There are many different kinds of these functions, each with its own characteristics and purpose.

For example, cryptographic hash functions can be used to map sensitive information into hash values with high dispersion, causing even the slightest changes in the source information to produce wildly different hash results. Because of this, two cryptographic hashes can (usually) only be compared to determine whether they came from the exact same source. We cannot, however, measure the similarity of two cryptographic hashes to ascertain the similarity of their sources.
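A quick way to see this dispersion property is to hash two nearly identical inputs with an off-the-shelf cryptographic hash such as SHA-256; the digests come out completely unrelated:

```python
# Hash two nearly identical strings with SHA-256 to show the dispersion
# property described above: a one-character change produces a completely
# unrelated digest, so the hashes tell us nothing about source similarity.
import hashlib

a = hashlib.sha256(b"perceptual hashing").hexdigest()
b = hashlib.sha256(b"perceptual hashing!").hexdigest()

print(a)
print(b)

# Only a handful of hex characters line up by chance, even though the
# inputs differ by a single byte.
matching = sum(1 for x, y in zip(a, b) if x == y)
print(f"{matching}/64 hex characters match")
```

Perceptual hashes are designed with the opposite goal: similar inputs should produce similar hashes, so their distance can serve as a similarity measure.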

Continue reading...

Knapsack Problem

Sunday, March 9th, 2014

The knapsack (or backpack) problem is a classic dynamic programming problem. While there are many variations of this problem, this post will focus only on the 0/1 version. The challenge was formally introduced over a century ago and pops up in many different areas, including cryptography, resource management, and complexity theory. It is also a popular question in programming interviews at several large companies.
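For a concrete reference point, here is a minimal sketch of the standard 0/1 knapsack dynamic programming recurrence; the item weights and values are hypothetical, chosen only for illustration.

```python
# Classic 0/1 knapsack DP: best[c] holds the maximum value achievable with
# capacity c using the items considered so far. Iterating capacities downward
# ensures each item is used at most once (the "0/1" constraint).

def knapsack_01(items, capacity):
    """items is a list of (weight, value) pairs; returns the max total value."""
    best = [0] * (capacity + 1)
    for weight, value in items:
        for c in range(capacity, weight - 1, -1):
            best[c] = max(best[c], best[c - weight] + value)
    return best[capacity]

# Hypothetical example data, for illustration only.
print(knapsack_01([(2, 3), (3, 4), (4, 5), (5, 8)], capacity=7))  # -> 11 (weights 2 and 5)
```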

Continue reading...

Primitive Texture Compression

Wednesday, February 5th, 2014

PTCX is a very simple compressed image format that I designed in 2003 as part of my Vision 1.0 project. The format features a basic adaptive quantization scheme that is reasonably effective for low-frequency texture information (e.g. grass and gravel) and supports a wide variety of pixel formats (including high-quality alpha).

My goal for this project was simply to explore image quantization and create something similar to the DirectX Texture Compression (DXT/S3TC) formats but with significantly greater flexibility (albeit without hardware support!). I dug PTCX up the other day and decided to see how it performed against the Lena test image.
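The excerpt above doesn't show PTCX's actual encoding details, but the general flavor of DXT/S3TC-style block quantization can be sketched as follows: pick two endpoint values per block and store a small per-pixel index that selects an interpolated level between them.

```python
# Toy sketch of block-endpoint quantization in the spirit of DXT/S3TC-style
# schemes. This is NOT the actual PTCX format, just an illustration of the
# general idea of quantizing texture blocks down to endpoints plus indices.

def quantize_block(pixels, levels=4):
    """pixels: a flat list of grayscale values for one block."""
    lo, hi = min(pixels), max(pixels)
    if hi == lo:
        return lo, hi, [0] * len(pixels)
    # Map each pixel to the nearest of `levels` values between lo and hi.
    return lo, hi, [round((p - lo) / (hi - lo) * (levels - 1)) for p in pixels]

def dequantize_block(lo, hi, indices, levels=4):
    return [lo + i * (hi - lo) / (levels - 1) for i in indices]

# A smooth (low-frequency) 4x4 block reconstructs reasonably well.
block = [52, 55, 61, 58, 70, 74, 80, 77, 90, 95, 101, 98, 110, 115, 120, 118]
lo, hi, idx = quantize_block(block)
print(dequantize_block(lo, hi, idx))
```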

Continue reading...

Technical Interviews 2.0

Saturday, November 16th, 2013

It's no secret that technical interviews are tough to get right. They're draining for the candidate and are a notoriously difficult way to assess a person's technical qualifications. Over the last 8 years, I've been heavily involved in the interview processes at a number of different companies. I've given over 250 interviews, served on several hiring committees, and screened countless resumes.

After a while, I started to realize that many great candidates were being incorrectly discarded by the process, while many less-than-stellar candidates were managing to get through. If technical interviews are a key discriminator in the hiring process, they do not appear to be very effective.

Continue reading...


