USB 3 Truly Legit for recording now. Eating Crow.

Yup. There are certainly advantages to USB3--the "nearly 5 watts to play with" being a huge one by my reckoning. It just seemed to me that the discussion was focusing on data throughput, when that's an area where only a tiny minority of users will see any practical difference at all.
 
Another update:

It looks like Thunderbolt is still going to be about twice the speed of USB3. Not based simply on transmission speed, however.

Thunderbolt uses dual channels. Each channel is rated at 10Gb/s, whereas USB only uses a single channel. So, even after the USB3 patch, you'd be getting 10Gb/s x 2 (20Gb/s) out of Thunderbolt (assuming both channels were in use), and only 10Gb/s out of USB3.
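Spelling that arithmetic out in a quick sketch (treating both Thunderbolt channels as poolable for data is this post's assumption; a reply below disputes it):

```python
# Bandwidth comparison as framed above. Pooling both Thunderbolt
# channels for data is an assumption; see the reply below disputing it.
TB_CHANNELS = 2
TB_GBPS_PER_CHANNEL = 10
USB3_PATCHED_GBPS = 10  # the post-"patch" USB 3 data rate discussed here

tb_gbps = TB_CHANNELS * TB_GBPS_PER_CHANNEL
print(f"Thunderbolt (both channels): {tb_gbps} Gb/s")            # 20 Gb/s
print(f"USB 3 (patched):             {USB3_PATCHED_GBPS} Gb/s")  # 10 Gb/s
print(f"Ratio: {tb_gbps // USB3_PATCHED_GBPS}x")                 # 2x
```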

Still, though, there is almost no licensing fee to use USB, whereas there is still a licensing fee for Thunderbolt.

While it's technically true that it has two channels running at 10Gbps, one carries data and one carries video, so you can never combine both for a single transfer.
If you could, the Thunderbolt specs would've stated 20Gbps, because that sounds a lot better for marketing. USB3.0 is going to be the same speed for data transfer now; you just can't daisy-chain it to a monitor like you will be able to with Thunderbolt.
 
Yup. There are certainly advantages to USB3--the "nearly 5 watts to play with" being a huge one by my reckoning.

Yes, on the assumption that USB 3 sub-systems - that is, motherboard USB ports, USB adapter cards, etc. - will in fact provide full power on available ports. The USB 2 reality is that the specified 500mA per port is not at all guaranteed, hence the need for powered hubs and the like to get reliable operation from various peripherals.
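For reference, here is where the "nearly 5 watts" figure comes from. These are spec maximums, which (per the caveat above) a given port may or may not actually deliver:

```python
# USB bus-power budgets per the specs. Real-world ports may deliver
# less, which is exactly the implementation caveat raised above.
VBUS_VOLTS = 5.0
USB2_MAX_AMPS = 0.5   # 500 mA per the USB 2.0 spec
USB3_MAX_AMPS = 0.9   # 900 mA per the USB 3.0 spec

print(f"USB 2.0: {VBUS_VOLTS * USB2_MAX_AMPS:.1f} W")  # 2.5 W
print(f"USB 3.0: {VBUS_VOLTS * USB3_MAX_AMPS:.1f} W")  # 4.5 W -> "nearly 5 watts"
```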
 
While it's technically true that it has two channels running at 10Gbps, one carries data and one carries video, so you can never combine both for a single transfer.
If you could, the Thunderbolt specs would've stated 20Gbps, because that sounds a lot better for marketing. USB3.0 is going to be the same speed for data transfer now; you just can't daisy-chain it to a monitor like you will be able to with Thunderbolt.

Thunderbolt going for 20Gbps
"Fricking weeks after its release Intel is about to change ThunderBolt into a 20Gbps connector.

The speediest Thunderbolt controller code-named Falcon Ridge will support speeds up to 20Gb/s as well as DisplayPort v1.1a and DisplayPort v1.2 Redriver. The latter technologies will also be supported by the Redwood Ridge controllers due in 2013, reports DigiTimes web-site. Given the fact that now Intel reportedly charges $20 per TB controller, the new ones should be more affordable.

The new - Falcon Ridge - Thunderbolt chip clearly improves speed by implementing PCI Express 3.0 technology outside the PC. From what has been told, it is obvious that it will continue to use copper cables."
 
Re the "nearly 5 watts" factor.
Not quite sure what the bitch is about or who is saying what?

I have used 3 USB 2.0 AIs on 3 desktop and 3 laptop PCs: Tascam 122 and 144 and an NI KA6. The computers were an old P4 desktop, a newer Asus-mobo dual-core AMD desktop, an HP dual-core W7/64 desktop, an HP XP laptop, an HP W7 laptop, and a Toshiba Vista (!) laptop. In none of those cases did I have any power problems (even Vista worked from the get-go!)

But the present 2.5W limit is a barrier to bus-powered operation. Just providing spook juice (phantom power) to spec would mean the mic-input limit is about 6, even if you had no other power to find. Headphone outputs tend not to have a lot of grunt, and it is very rare to find more than one provided on bus-powered AIs.
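A rough sanity check on that "about 6" figure. The per-mic current draws below are assumptions (the P48 standard allows up to 10mA per mic; typical condensers draw less), and DC-DC conversion losses are ignored:

```python
# How many phantom-powered mic inputs fit in a 2.5 W USB 2.0 budget?
# Per-mic draw is an assumption: P48 allows up to 10 mA per mic,
# and many condensers draw less. DC-DC conversion losses are ignored.
BUDGET_WATTS = 2.5   # USB 2.0 bus power: 5 V x 500 mA
P48_VOLTS = 48.0

for ma in (10.0, 7.0):  # worst-case spec draw, then a typical draw
    per_mic_watts = P48_VOLTS * ma / 1000.0
    print(f"{ma:4.1f} mA/mic -> {BUDGET_WATTS / per_mic_watts:.1f} inputs")
# ~5 inputs at worst case, ~7 at a typical draw: "about 6" sits in between.
```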

And no fair saying "they" won't provide the full power for USB 3.0; that is a failing of the implementation, not the system. "THEY" never did supply FW power on laptops, much. And has every FW port on every desktop got a full tank? Don't think so, but I do not criticize the system for that.

Dave.
 
Re the "nearly 5 watts" factor.
Not quite sure what the bitch is about or who is saying what?

...

And no fair saying "they" won't provide the full power for USB 3.0; that is a failing of the implementation, not the system. "THEY" never did supply FW power on laptops, much. And has every FW port on every desktop got a full tank? Don't think so, but I do not criticize the system for that.

Dave.

Are you referencing my earlier post? https://homerecording.com/bbs/gener...cording-now-eating-crow-352171/3/#post3998584

I don't believe I've made any critical comments about USB 3 whatsoever. Surely raising a caveat is not a criticism? And my comment was *specifically* about implementation.

I have certainly had bus power problems with USB 2 (albeit not with audio gear, but with DVB tuner devices).

For my part, I remain content to sit back and observe before passing judgment. I just tend to be slightly dismissive of hype until things are proven in the field.

Paul
 
Well, shoot. If Thunderbolt has hit 20Gbps instead, like Mason's link says, that's insane.
Of course, at 10Gbps you're getting about the same latency as a PCIe HD rig, so if you're hitting 20Gbps, I have no clue what you're looking at as far as latency goes. xD
 
Er, the data throughput to/from your interface has a relatively small effect on latency. What's happening inside your computer and DAW is much more significant.
 
Well, shoot. If Thunderbolt has hit 20Gbps instead, like Mason's link says, that's insane.
Of course, at 10Gbps you're getting about the same latency as a PCIe HD rig, so if you're hitting 20Gbps, I have no clue what you're looking at as far as latency goes. xD

As I stated near the beginning of this thread, the throughput of the audio transport has no bearing on latency. Latency is determined first by how the audio interface/manufacturer utilizes existing transport protocols, and second by the anti-aliasing/anti-imaging digital linear-phase filters employed in the conversion from analogue to digital and vice versa. The latter is highly dependent on manufacturer design, hence the variety of latency specs we see from interface to interface.
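To put rough numbers on that, compare how long a typical driver buffer holds audio with how long the same audio takes to cross the wire. The buffer size and channel count below are illustrative assumptions, not specs from any particular interface:

```python
# Why transport throughput barely moves latency: the driver buffer
# holds audio for milliseconds, while the wire transfer of that same
# buffer takes microseconds. All sizes here are illustrative.
SAMPLE_RATE = 48_000    # Hz
BUFFER_SAMPLES = 128    # a common driver buffer setting (assumed)
CHANNELS = 8            # assumed channel count
BITS = 24               # assumed sample width

buffer_ms = BUFFER_SAMPLES / SAMPLE_RATE * 1000
print(f"One buffer of audio: {buffer_ms:.2f} ms")  # ~2.67 ms

payload_bits = BUFFER_SAMPLES * CHANNELS * BITS
for name, mbps in (("USB 2.0 (480 Mbps)", 480), ("USB 3.0 (5 Gbps)", 5000)):
    wire_ms = payload_bits / (mbps * 1_000_000) * 1000
    print(f"{name} wire time: {wire_ms:.4f} ms")
# Buffers (and converter filters) cost milliseconds; the wire costs
# microseconds. A faster transport shrinks the part that was already tiny.
```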

Take AoE (Audio over Ethernet) networks, for example. Many manufacturers have their own version of this technology, and most of them use similar IT hardware with the same standard CAT5 Ethernet cable. The hurdle here is that standard IP-based data transmission is "packet-based", which carries inherent latency: packets are not always received in the order they were sent, so the system incurs latency organizing them into a coherent stream. This is why non-proprietary AoE flavors usually (there are exceptions) have higher latency than proprietary flavors. Take AES50/Supermac/Hypermac, for example. This AoE protocol bypasses the QoS (Quality of Service) barriers inherent in IP-based networks and employs "frame-based" transport built on Ethernet frames. This sends the audio point to point and is much more efficient, supplying end-to-end transmission latency measured in MICROseconds, often just a few samples, never mind milliseconds. CobraNet, by contrast, is a non-proprietary protocol with a latency no lower than about 5ms, because it rides on standard packet-based networks in order to be compatible with off-the-shelf IT hardware; AES50/Supermac/Hypermac is not bound by that constraint.

Just as a reference, Supermac can supply 24 channels of 48kHz audio through a single CAT5 cable on a 100Mbps network at a latency of around 62.5 µs (microseconds). USB 2.0 has a max throughput of 480Mbps.
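Those figures hang together neatly, for what it's worth; a quick check, assuming 24-bit samples:

```python
# Sanity-checking the Supermac numbers above (24-bit samples assumed).
SAMPLE_RATE = 48_000
CHANNELS = 24
BITS = 24

# 62.5 microseconds is exactly 3 sample periods at 48 kHz.
print(f"62.5 us = {62.5e-6 * SAMPLE_RATE:.0f} samples at 48 kHz")  # 3

# The raw payload fits easily in a 100 Mbps link (before framing overhead).
payload_mbps = CHANNELS * SAMPLE_RATE * BITS / 1e6
print(f"Payload: {payload_mbps:.1f} Mbps of a 100 Mbps link")  # ~27.6 Mbps
```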

Cheers :)
 
Are you referencing my earlier post? https://homerecording.com/bbs/gener...cording-now-eating-crow-352171/3/#post3998584

I don't believe I've made any critical comments about USB 3 whatsoever. Surely raising a caveat is not a criticism? And my comment was *specifically* about implementation.

I have certainly had bus power problems with USB 2 (albeit not with audio gear, but with DVB tuner devices).

For my part, I remain content to sit back and observe before passing judgment. I just tend to be slightly dismissive of hype until things are proven in the field.

Paul

Sorry Paul.
No, not you or in fact anybody in particular. I just got the idea that the power hike was being sidelined a bit when in fact (maybe 'cos I understand watts and stuff and not the "digitals"!) I think it is going to make quite a difference to the sort of AIs we see... WHEN the buggers remove the digit and MAKE some!

Dave.
 