Adjust input level of audio driver?

For discussion of the WDM and ASIO Axia Drivers

Moderators: AXIA_milos, Maciej

JonCyphers
Posts: 10
Joined: Tue Feb 12, 2008 10:04 pm
Location: Wichita, KS

Adjust input level of audio driver?

Post by JonCyphers »

WinXP SP2 running SoundForge with a single Axia audio driver.

When I feed tone from a CD and make the meter on the Element's VGA display read a perfect -20, Sound Forge reads -7.8, so when we run program material everything is close to clipping or actually clipping. I just have people ignore the Element meters for now, but that's not a practical approach.

The audio driver is being fed from the PGM4 of the mix engine.

I don't want to drop the PGM level out of the mix engine (not even sure I can), and I can't find any way to reduce the input level for the driver.

The same PGM out feeds our Protools interface on the same machine via analog from an analog node, and the level looks proper with plenty of headroom.

Any help on finding that adjustment? Or what have I done wrong?
Thanks!
jon

AXIA_milos
Axia Team
Posts: 281
Joined: Fri May 25, 2007 2:54 am
Location: San Luis Obispo, CA

Post by AXIA_milos »

Go to Control Panel.
Look for the icon "Axia IP-Audio" and select it.
In the bottom left corner is an audio level adjustment.
The adjustment sets what the "nominal" recording level is.
If your "nominal" recording level is the same as your console's nominal level (-20 dBFS), set the adjustment to match. You will notice the Record Trim will be set to 0, and so will the Playback Trim. If you record at a hotter nominal level, for example -10 dBFS, the Record Trim will adjust to +10 dB (meaning there will be a gain of 10 dB on the audio coming into the PC), and playback will adjust the gain of the file being played with a trim of -10 dB.
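The trim arithmetic described here can be sketched in a few lines (a hypothetical illustration only, not Axia's actual driver code; the -20 dBFS console nominal is the Element example from this thread):

```python
# Illustration of the record/playback trim arithmetic described above.
# Hypothetical sketch -- not Axia's actual driver logic.

CONSOLE_NOMINAL_DBFS = -20.0  # the console's nominal level (Element at -20 dBFS)

def driver_trims(selected_nominal_dbfs: float) -> tuple[float, float]:
    """Return (record_trim_db, playback_trim_db) for the nominal
    recording level chosen in the driver's control panel."""
    # Recording at a hotter nominal than the console needs positive gain
    # coming in, and the mirror-image negative gain on playback.
    record_trim = selected_nominal_dbfs - CONSOLE_NOMINAL_DBFS
    return record_trim, -record_trim
```

With a matched nominal of -20 dBFS both trims come out as 0; choosing -10 dBFS yields a +10 dB record trim and a -10 dB playback trim, exactly the behaviour described above.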
Also note, the record meters of most PC-based applications show dBFS, whereas with Element 2.0 the meters are not dBFS but have VU behaviour, with a blue marker that shows you peak points.
-milos

JonCyphers
Posts: 10
Joined: Tue Feb 12, 2008 10:04 pm
Location: Wichita, KS

Post by JonCyphers »

Thanks Milos, that was easy. It fixed our iProfile too; all of those drivers must default to -8 (who uses -8?).

AXIA_milos
Axia Team
Posts: 281
Joined: Fri May 25, 2007 2:54 am
Location: San Luis Obispo, CA

Post by AXIA_milos »

Levels are all over the place. With the introduction of digital editing, I think it has gotten worse. In my experience, a lot of production has been produced at hot levels. I once worked at a facility that had become so accustomed to the meters on a Panasonic DAT unit that the facility's "nominal" level, when they installed a digital surface, was set to -10dBFS (because production had to have both meters match). Also, with the different types of meters out there (VU, dBFS, BBC PPM, etc.), it all turns into a big mess once you start mixing them together. So my best advice: understand what the meter is telling you, and move from there.

stevechurch
Fearless Leader
Posts: 5
Joined: Fri Jun 01, 2007 4:02 am

Meters all over the place

Post by stevechurch »

Milos is right about that! Here's an excerpt from a paper I'm writing on just this topic...

It is said that fish have no chance to understand the nature of water. Perhaps the same is true of American broadcast engineers who have been immersed in VU all of their working lives. For one, I can say that I had been blissfully ignorant until a recent conversation with a European broadcast practitioner caught me by surprise and set me on a course of discovery. Over a course of fish and chips (or was it tea and crumpets?), he told me that 6dB console headroom was enough and 9dB was usual and plenty. Huh? Wouldn’t that mean pretty much full-time clipping and distortion? Turns out, no.

Volume Units were originally developed in 1939 by Bell Labs and broadcasters CBS and NBC for measuring and standardizing the levels of telephone lines. The instrument used to measure VU was called the volume indicator (VI) meter. Everyone ignores this, of course, and calls it a VU meter. The behavior of VU meters is an official standard, originally defined in ANSI C16.5-1942 and later in the international standard IEC 60268-17. These specify that the meter should take 300 milliseconds to rise 20dB to the 0dB mark when a constant sine wave of amplitude 0 VU is first applied. It should also take 300ms to fall back to -20dB when the tone is removed. This integration time is quite long relative to audio waveform periods, so the meter effectively incorporates a filter that removes peaks in order to show a long-term average value.
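That averaging behavior can be roughly sketched with a first-order filter whose time constant is chosen so a step input reaches 99% of its final value in about 300 ms. This is only an approximation for illustration; the ANSI/IEC standards specify the exact ballistics, which a simple first-order filter does not reproduce:

```python
import math

def vu_response(samples, sample_rate):
    """Approximate VU-style averaging with a first-order filter.
    The time constant is chosen so a step input reaches 99% of its
    final value in ~300 ms, mimicking (not matching) VU ballistics."""
    tau = 0.3 / math.log(100)                      # ~65 ms time constant
    alpha = 1.0 - math.exp(-1.0 / (sample_rate * tau))
    level, out = 0.0, []
    for x in samples:
        level += alpha * (abs(x) - level)          # smoothed rectified average
        out.append(level)
    return out
```

Feed it a sustained tone and the reading settles to the tone's level; feed it a 5 ms transient and the meter barely moves, which is exactly why VU meters hide short peaks.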

But have you noticed that digital devices like PC-based audio editors usually have peak-reading meters? Their waveform displays are like storage oscilloscopes that can accurately show peaks and the meters are made to correspond to the levels traced by these editing displays. They are usually marked in dBfs – that is, dB down from full-scale. And this is how we need to think about audio levels in the digital context.

With analog, the numbers on the meter are relative to whatever value we decide to choose for the voltage level on the connection circuit. We nonchalantly misuse the decibel as if it were a voltage when in fact the dB is the logarithmic ratio of two voltages. A VU meter is actually an AC voltmeter with strange markings. 0dBu corresponds to 0.775Vrms and the other values are referenced to that. While this voltage may seem an odd choice, when applied to the 600Ω load used by vintage gear, the power dissipated is 1mW, a nice, clean point of reference. The modern USA practice is that the 0VU mark on the VU meter corresponds to +4dBu, or 1.228Vrms. (The u in dBu stands for unloaded. This is in contrast to dBm, which assumes the 600Ω load.) Not so long ago, +8dBm was the norm, and other countries use a variety of values today. With digital systems, we have an unambiguous and universal anchor – 0dBfs as the maximum absolute clipping point. That is why DAT tape recorders abandoned the VU meters that were common on analog tape machines for bargraph meters marked in dBfs.
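The voltage relationships above are easy to verify numerically (a minimal sketch of the standard dBu formula; the 600Ω load is the vintage reference described in the text):

```python
import math

DBU_REF_VRMS = 0.775  # 0 dBu is defined as 0.775 Vrms

def dbu_to_vrms(dbu: float) -> float:
    """Convert a dBu value to RMS volts."""
    return DBU_REF_VRMS * 10 ** (dbu / 20)

def vrms_to_dbu(vrms: float) -> float:
    """Convert RMS volts to dBu."""
    return 20 * math.log10(vrms / DBU_REF_VRMS)

def power_mw(vrms: float, load_ohms: float = 600.0) -> float:
    """Power dissipated in a resistive load, in milliwatts."""
    return vrms ** 2 / load_ohms * 1000

print(round(dbu_to_vrms(4.0), 3))   # +4 dBu -> 1.228 Vrms, as above
print(round(power_mw(0.775), 2))    # 0 dBu into 600 ohms -> ~1.0 mW
```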

Turning back to headroom, let’s first consider the analog case. The clip point for most modern studio equipment is +24dBu. With +4dBu nominal operating level, we arrive at 20dB for headroom. If we want the same headroom in a digital system, we should set our nominal operating level to -20dBfs. And this is just what USA TV and film people usually do, following the SMPTE recommendation RP155.
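The headroom arithmetic is trivially checkable (a sketch using just the figures in this paragraph):

```python
# Headroom in the analog case, per the figures above.
ANALOG_CLIP_DBU = 24.0      # clip point of most modern studio equipment
ANALOG_NOMINAL_DBU = 4.0    # USA nominal operating level

headroom_db = ANALOG_CLIP_DBU - ANALOG_NOMINAL_DBU   # 20 dB of headroom

# To keep the same headroom digitally, the nominal level sits that far
# below the 0 dBFS clipping point -- the SMPTE RP155 alignment level.
digital_nominal_dbfs = -headroom_db

print(headroom_db)           # 20.0
print(digital_nominal_dbfs)  # -20.0
```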

With all this as background, we’re ready to rejoin our delightful lunch companions with the refined accents and milky tea. Americans love their slow meters, but the Brits say that VU means Virtually Useless – and they have a point. For as long as American broadcasters have been staring at VU meters, our British cousins have been gazing into their beloved BBC-style Peak Programme Meters. These have a rise time 30 times faster than VU meters and a fall-back time of 2.8 seconds. Because of the slow fall-back time, they look lazier than a VU, but actually they are much more accurately registering peaks. British engineers usually set their maximum operating level to -9dBfs in digital systems; allowing about 3dB for the true peaks that even a PPM misses, they have 6dB (9 - 3 = 6) in reserve for the excited sportscaster. Hmmm… isn’t that the same as for the VU contingent stateside? Yup.

As those well-known audio engineers Led Zeppelin once sagely informed us, sometimes words have two meanings. Context matters. When Americans speak about their 20dB headroom, it is in the context of their slow attack-time VU meters. When Brits pronounce upon their “perfectly adequate” 6-9dB headroom, they are referring to systems with fast attack PPM indicators. Both achieve about the same result most of the time. And one’s interpretation of headroom depends upon what’s doing the metering.

England is but one European country, and the others have different ideas. The BBC-style PPM has only numbers from 1 to 7 and no other markings to tell operators the normal operating level or the maximum level. (To aid novice operators, the BBC's motto, Nation shall speak peace unto nation, has been adapted to Nation shall peak six unto nation.) The European Broadcasting Union has opted for a more transparent approach. The EBU digital PPM is labeled with dBfs values, has a “reference level” mark for system alignment at -18dBfs and a color transition signaling permitted maximum level at -9dBfs. The new German IRT meter is the same as the EBU meter, but is labeled somewhat like a VU, with a 0dB mark and green/red transition at -9dBfs, and a reference level mark at -9dB on its own scale, which corresponds to -18dBfs. Yet another variant is the Nordic N9 meter, which has the word TEST marked at -9dBfs and a compressed scale above this point.

One advantage of the VU meter is that unlike the European meters, the nominal level, the reference level, and the permitted maximum level are all the same – 0VU, marked with an unmistakable scale change from a black line to a big red bar. The phrase nominal level was invented for VU meters to suggest that the value is not “real” but a filtered compromise and approximation.

What about the rest of the world? Mostly, Latin Americans and Asians have followed the USA’s VU meter approach. Even in Europe, French, Spanish, Italian and most non-public broadcasters everywhere favor the good ol’ VU (although often with a shift in the analog level that corresponds to the meter’s 0VU).
