THE IMPORTANCE OF UNDERSTANDING BANDWIDTH


If there’s one word that gets bandied around in our industry more than any other, it’s ‘bandwidth’. It’s even made its way into the general office lexicon (“sorry boss, I don’t have the bandwidth to do that”).

With the recent announcement of the HDMI 2.1 standard promising a staggering 48Gb/s capacity, there’s never been a more important time to fully understand the term ‘bandwidth’.

For an engineer, the word ‘bandwidth’ can mean many different things, and its precise meaning often depends on additional context.

Geoff Meads, Managing Director of Presto AV (www.prestoav.com) and Lead Instructor for CEDIA EMEA (www.cedia.co.uk), explores the importance of ‘bandwidth’ and exactly how much a home technology professional needs to know.

To understand the term ‘bandwidth’ is to understand the capacity of an installed system: specifically, how much data will ‘fit’ within it. Without this understanding, we can have no clear idea of a system’s real-world performance or likely reliability.

Definitions
Scientifically, we consider the term ‘bandwidth’ to mean ‘the amount of data that can be transmitted and wholly received through a given medium within a specific time’.

The ‘medium’ in question might be a cable, a processor or even a complete transmission system such as HDMI. The ‘specific time’ is usually one second. It is important to highlight the phrase ‘wholly received’ within this definition – it is crucial to understand that whatever we send must ALL be received correctly.
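
By way of a rough illustration (a Python sketch only; the capacity and payload figures are arbitrary assumptions, not taken from any real product or standard), the few lines below treat bandwidth exactly as defined above: a quantity of data that must be wholly received through a medium within one second.

# Minimal sketch: bandwidth as 'data wholly received per second'.
# All figures are illustrative assumptions only.

link_capacity_bps = 1_000_000_000   # assumed medium capacity: 1 Gb/s
payload_bits = 750_000_000          # assumed data we need to send within one second

def fits_in_one_second(payload_bits, link_capacity_bps):
    """Return True if the payload can be wholly received within one second."""
    return payload_bits <= link_capacity_bps

print(fits_in_one_second(payload_bits, link_capacity_bps))   # True: 0.75 Gb fits in a 1 Gb/s medium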

Analogue vs. Digital
In the analogue domain, we use the terms ‘cycles per second’ or ‘Hertz’ to measure bandwidth, whilst for digital we use ‘bits per second’ (b/s). In either case, we use standard international multipliers to shorten larger values – ‘Kilo’ or ‘K’ for x 1,000, ‘Mega’ or ‘M’ for x 1,000,000 and ‘Giga’ or ‘G’ for x 1,000,000,000.

If we take a twisted pair data cable like Cat5e as an example, we’ll see different values quoted for analogue and digital bandwidth: a Cat5e cable will carry 1,000,000,000b/s (1Gb/s) of data over a limited length, but the cable specification might state around 350MHz for its analogue capacity.
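
To keep those multipliers straight, here is a minimal Python sketch expressing the Cat5e figures quoted above with the standard prefixes (the numbers are simply those from the paragraph, not from a cable datasheet):

# Standard multipliers, usable for both analogue (Hz) and digital (b/s) figures.
KILO = 1_000
MEGA = 1_000_000
GIGA = 1_000_000_000

cat5e_digital_bps = 1 * GIGA     # 1Gb/s digital capacity over a limited length
cat5e_analogue_hz = 350 * MEGA   # ~350MHz quoted analogue capacity

print(f"Digital:  {cat5e_digital_bps:,} b/s")   # Digital:  1,000,000,000 b/s
print(f"Analogue: {cat5e_analogue_hz:,} Hz")    # Analogue: 350,000,000 Hz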

So, what’s the difference?
The answer lies in the format of the data being transmitted. An analogue signal is constantly varying. Whilst we often consider an analogue signal to be a sine wave, in truth it can take all sorts of shapes. For example, two pianos playing the note A4 will both have a fundamental frequency of 440Hz, but the shape of the sound wave each emits (and the shape of the resulting electrical wave when that sound is sent as a current down a cable) will be subtly different. We hear this difference as a tonal difference between the two instruments. So, for analogue signals we need to transmit and receive not just the right number of waves per second but their exact shape too.
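
As a rough illustration of that idea (a sketch only; it assumes NumPy is available, and the harmonic mixes are invented for the example rather than measured from real pianos), the lines below build two waves that share a 440Hz fundamental but carry different overtones, so a frequency counter would read both as A4 even though their shapes differ:

import numpy as np

sample_rate = 48_000                        # samples per second
t = np.arange(sample_rate) / sample_rate    # one second of time values
f0 = 440.0                                  # fundamental frequency of A4

# 'Piano 1': fundamental plus a strong 2nd harmonic.
piano_1 = np.sin(2 * np.pi * f0 * t) + 0.5 * np.sin(2 * np.pi * 2 * f0 * t)

# 'Piano 2': same fundamental, but a 3rd harmonic instead.
piano_2 = np.sin(2 * np.pi * f0 * t) + 0.5 * np.sin(2 * np.pi * 3 * f0 * t)

# Same pitch, different wave shape: the samples clearly differ.
print(np.allclose(piano_1, piano_2))   # False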

With digital, it’s a little different. While it’s ideal to maintain the exact shape of a digital wave within the transmission system, each ‘bit’ of data will be interpreted at the receiving end as simply a ‘1’ or a ‘0’. As long as the receiver understands the intent of the signal (‘1’ or ‘0’), the exact shape of the wave is not so important.
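
A minimal sketch of that principle (the levels, noise range and threshold below are illustrative assumptions, not values from any real transmission standard): the received levels are distorted, yet deciding each bit against a mid-point threshold still recovers the original data exactly.

import random

random.seed(1)   # repeatable 'distortion' for the example

bits_sent = [1, 0, 1, 1, 0, 0, 1, 0]

# Transmit each bit as a level (1 -> 1.0, 0 -> 0.0) and add some distortion.
received_levels = [bit + random.uniform(-0.3, 0.3) for bit in bits_sent]

# The receiver only asks: is each level above or below the mid-point threshold?
THRESHOLD = 0.5
bits_received = [1 if level > THRESHOLD else 0 for level in received_levels]

print(bits_received == bits_sent)   # True: the intent ('1' or '0') survives the distorted shape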

This is one of the fundamental advantages of a digital system. It also means the digital bandwidth of a particular medium is often higher than its analogue bandwidth, because the system continues to work even when the wave shape becomes distorted at higher frequencies.

Installer Realities
Bandwidth restrictions are all around us and we must make the best use of the bandwidth we have. Equally, if we want a system to work consistently, we mustn’t try to squeeze more data down a medium than it can carry.

We’ve only covered the very beginnings of the subject here, but one thing is for sure: if you’re interested in building systems with maximum reliability and flexibility, you owe it to yourself to fully understand this important subject.

As featured in the January edition of Essential Install.