What kept us from creating the USB-C we have today 10/20 years ago?


USB-C has insane transfer rates, enough bandwidth to drive multiple screens, you can even use it to charge devices at the same time, and the port is already dangerously small. Genius!

What kept us from having this, 10, 20 years ago?

In: Technology

4 Answers

Anonymous 0 Comments

> What kept us from having this, 10, 20 years ago?

There were ways to transfer crazy amounts of data back and forth 20 years ago – for supercomputers and internet infrastructure, for example. Display cables supported high bitrates as well: the highest-resolution screens that were common in the 90s ran at 1600×1200 and 60 Hz. In terms of raw data, that is nearly 3 Gbit/s! That is still shy of USB-C, but it’s much more than USB could handle back then.
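The arithmetic behind that figure is easy to check. This sketch assumes 24-bit color, which was typical for the era, and ignores blanking intervals, which real VGA/DVI signals add on top:

```python
# Uncompressed video bandwidth for a 1600x1200 display at 60 Hz.
# Assumption: 24 bits per pixel; blanking overhead is ignored, so real
# cables carried somewhat more than this.
width, height, refresh_hz, bits_per_pixel = 1600, 1200, 60, 24
bits_per_second = width * height * refresh_hz * bits_per_pixel
print(bits_per_second / 1e9)  # ~2.76 Gbit/s
```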

However, this came at a price: the hardware needed to drive a screen at such a resolution was expensive and used a lot of power. There was just no way to integrate that into a small mobile device. On top of that, there were no peripherals that could handle speeds like that – no lightning-fast flash memory like we’re used to today. The fastest thing you could realistically connect to USB was a hard disk transmitting data in megabits per second, not gigabits.

That said, there was actually a standard that could run much faster than USB back then: [FireWire](https://en.wikipedia.org/wiki/IEEE_1394) could reach speeds as high as 400 Mbit/s. It was used for things like external hard drives, which were often faster than what USB could handle.

Anonymous 0 Comments

There have been lots of improvements in transfer rates in a number of areas. The issue with high data rates was that if you sent data too fast, the individual bits would interfere with each other (intersymbol interference) and arrive as just a blur at the receiving end of the cable. Today we have much better knowledge of how this happens and what we can do to prevent it, or to interpret the blur so the data still gets through. In addition, we have figured out what kind of cable works best and how to make it cheaply. The problem was that USB did not use that cable or connector. So while other technologies like PCIe, SATA and HDMI were able to adopt the new techniques as they came out, USB was left behind. The USB 3.x standard was designed almost 10 years ago, but it required new connectors, new cables and new controller chipsets, so it took some time to become common.

Anonymous 0 Comments

The transfer rates and power delivery features are largely independent of the connector, although the B/Mini-B/Micro-B connectors needed substantial revisions to fit USB 3’s extra data pins.

Basically as time went on, there were a lot of needs that couldn’t be handled easily by existing connectors, so a new one was eventually introduced to fit those needs.

Anonymous 0 Comments

We could easily have had the cables and the connectors.

We could not have built the sort of gadgets that would be able to make use of that.

Stuff like the ability to plug the connector in upside down and have it still work would easily have been doable 20 years ago.

The power-delivery aspect, which allows you to power and charge devices at up to 100 watts, is a bit trickier, but in general something we could have made work decades ago too.
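For scale, the 100 W figure comes from USB Power Delivery negotiating higher voltages than USB's classic 5 V, not just more current. A rough sketch of the standard fixed-voltage levels (values per the USB-PD spec; the 5 A level additionally requires an electronically marked cable):

```python
# USB Power Delivery reaches 100 W by raising the voltage, not only the current.
# Standard fixed-voltage profiles from the USB-PD spec: (volts, max amps).
profiles = [(5, 3), (9, 3), (15, 3), (20, 5)]
for volts, amps in profiles:
    print(f"{volts:>2} V x {amps} A = {volts * amps:>3} W")
# Top profile: 20 V * 5 A = 100 W
```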

The transfer speed of up to 10 Gbit/s is another thing entirely.

10 Gbit/s transfer speeds are not new. 10-gigabit Ethernet has been around for a while now, and stuff like InfiniBand from Mellanox has been available for about two decades. It was just very, very expensive when it first came out. (It is still expensive today.)

So in theory it would have been possible to build something like USB-C in the year 2000, but it would have been very bulky and extremely expensive, and it would not have caught on: people wouldn’t want to, for example, add a port to their smartphone that tripled its weight and increased the price by a factor of ten or a hundred.