USB-C is about to go from 100W to 240W, enough to power beefier laptops (theverge.com)
428 points by Tomte on May 26, 2021 | 478 comments


I look forward to USB 4.1a Type C.2 Phase iii.b Gen 3E and the 482646282 different cable capability combinations that all plug in and have no distinction of those capabilities whatsoever. I also look forward to all the necessary and helpful posts telling me how the identical ports on my laptop are in fact different despite their identical appearance.

But at least everything uses the same plug, so that’s nice.


It actually is sort of nice, because "works" isn't binary. It's nice to be able to charge a laptop with what you have, even if it doesn't charge quite as fast. You can transfer data over a cable at lower speeds even if it doesn't run at top speed.

Using the "right" cable is performance optimization. This isn't like the old days when plugging things in the wrong way might damage your machine.


Not using the proper cable could also lead to fires, for example. We are talking about a significant amount of current; if you don't use a cable rated for that current, the cable or the connector will overheat, and possibly start a fire.

And it's not a remote possibility. It happened to me with a fast-charging phone power supply and the phone, and the cable was the one provided by the manufacturer! The Type-C connector at the phone side was red hot and starting to melt, probably due to a bad connection from dirt in the phone's connector. Fortunately I was there, smelled the burnt plastic and disconnected the cable, but what if that had happened at night?

And it's not something trivial that power supplies and phones can detect: there is no way to determine the voltage drop in the cable built into the standard. Basically all they needed to do was add a voltage-sensing pin on the connector, to be shorted with VCC at the load side, so the power supply could sense the voltage at the other end and determine if there's too much drop. But as far as I know it doesn't have one.
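
To sketch the idea in Python (purely hypothetical: this mechanism is not in any USB spec, and the function name and the 50 mV/A budget are made up for illustration):

    # Supply-side check: v_out is what the supply drives, v_sense is what
    # the proposed sense pin would report from the load end of the cable.
    def cable_drop_ok(v_out, v_sense, i_load, max_drop_per_amp=0.05):
        drop = v_out - v_sense          # volts lost in wire and contacts
        return drop <= max_drop_per_amp * i_load

    # A dirty connector: 5.0 V out but only 4.5 V arriving at 3 A is about
    # 167 mV per amp, far over budget, so the supply could cut power early.
    cable_drop_ok(5.0, 4.5, 3.0)   # -> False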


It should be impossible for any set of non-damaged, non-defective, spec-compliant USB-C cables and peripherals to cause fires. The specifications are very carefully designed to prevent any combination of cables and devices from causing damage.


> It should be impossible for any set of non-damaged, non-defective, spec-compliant USB-C cables and peripherals to cause fires.

Then there's the crap offered for sale on Alibaba which somebody will resell on Amazon and which will show up on a checkout rack at the gas station. USB-C has very tight tolerances and unusual material requirements. Dirt or water can create a conductive path all too easily with USB-C pin spacing.

"Decreasing the risk of fire in USB-C connectors" is worth watching.[1] It's an ad for a plastic material, but covers the problems.

[1] https://youtu.be/jYqDh9q5H6I


I’ve never heard of USB-C causing a fire. The control ICs should cut the current the moment anything is out of spec. Do you have any examples of that?


https://www.reddit.com/r/Android/comments/7j3k38/anker_usbc_...

https://wccftech.com/macbook-survives-disables-all-usb-ports...

Researchers even developed an exploit for some USB-C fast chargers that forces them to deliver max power and tries to set the connected device on fire: https://www.youtube.com/watch?v=Es2SqubYSfo


The problem is the gazillion proprietary fast-charging standards.

Some of them were designed for chargers with an attached cable, not a detachable one.

Many charger ICs emulate the fast-charging signalling to trick phones into charging faster, but of course they have no way to check whether the wiring is OK on micro-USB.

So you can end up with 5-10 A going over flimsy wires and contact pads. Or, if they use Type-C, they can end up with fake 5 A cables. Or, in the case of VOOC, they don't check if the cable is an actual VOOC cable, which carries over 5 A.


>Or in case of VOOC, they don't check if the cable is an actual VOOC cable, which goes over 5A

Actually they do. My OnePlus, which uses the same VOOC tech, will not charge at full power with its fast charger if I don't use the cable from the box.


Do you use an A-to-C cable, or C-to-C?

VOOC A-to-C has a fifth pin mapped to the CC pin on the C side.

Rogue VOOC C-to-C chargers just send VOOC signalling directly on CC.


Mine is A to C.


The comment two above yours is not enough?

Red hot metal and smoking plastic.


Yeah, but there will always be crap from Alibaba for anything you can think of. The solution would be banning charging altogether and adopting only proprietary standards.


No, the solution for the US would be to refuse imports of any power device that lacks UL/CSA approval.


I don’t even think that is required. What might be required is simply making it more explicit when a device is or is not UL listed. I shouldn’t have to search Amazon’s “questions” section on a product listing to see if it is UL listed.

Of course I suppose products that actually are UL listed on Amazon will already mention the fact somewhere in their description.

That being said I’ve had more issues with low voltage crap letting the magic smoke out than mains powered devices.


What about the 96% of human beings that buy their electronics outside of the USA?


They have their own government entities enact corresponding regulations.


The solution is obvious, they just need to be constructed of prefabulated amulite.


That's not entirely a joke. "Halogen-free polyamides are explicitly recommended by the USB-IF in the USB 3.1 specification."[1] Most common plastics do not have a high enough resistance to surface tracking, where, as the connector wears, a conductive path appears across the surface due to dirt and moisture.

[1] https://www.dsm.com/content/dam/dsm/electrical-electronics/e...


...as long as you have the necessary hydrocoptic marzlevanes. You don't want side fumbling.


The Nintendo Switch can be bricked if you use the wrong cable, although my understanding is that it's not built to the USB-C spec: https://www.reddit.com/r/NintendoSwitch/comments/87vmud/the_...

Either way, this is an incredibly common device that can be damaged by incompatible hardware where it's difficult to determine compatibility without bricking the device.


> it's difficult to determine compatibility without bricking the device

This is the problem! This is what everybody talking about the specs doesn't get - it's difficult to (1) tell which spec a device claims to support and (2) verify that it actually supports it (see: a lot of cheap devices on Amazon) and also (3) in real life many devices do not support the specs (i.e. this isn't a theoretical problem).


Unless I'm mistaken, there is a license required to call a port a USB port, though goods are commonly sold without it. In theory the solution is simple: only buy products that carry the actual USB logo to ensure certification, and the USB-IF should retain the right to significantly fine any manufacturer that advertises being to spec and uses the logo while in fact not complying.

In practice, people aren't going to stop buying Nintendo Switches, unfortunately.

Still, it's not entirely necessary to support all specs or have that be entirely clear; what does need to be clear is whether you can expect safety specs to be followed. If I plug my Nintendo Switch into a charger thinking it'll charge at light speed but it takes hours, oh well. If I plug it in and it destroys the device, that should be pretty much an unforgivable problem. Personally I'd be happy to abstain from buying such a device because of that, though also personally I did buy one, expecting no issues, and only found out months later when I saw it mentioned somewhere online. Ideally research is done on every product to ensure things like this aren't the case, but again, in practice that doesn't always happen.

I'm not sure what the fix is, other than making USB less universal by requiring a license for any and all vendors using the design. If the license is cheap enough, maybe that could work? I don't know the legalities so well, but maybe a free license could be required for any implementation, with a spec that amounts to: make sure it doesn't brick the charger or the device, and make sure it doesn't catch on fire.


> only buy products that have the actual USB logo

I just took my Samsung A50 out of the case, no USB logo anywhere on the device. There's the Samsung logo, some recycling logo and a CE logo, and that's it.


One solution could be to require different connectors for different protocols and power delivery.

I'm beginning to suspect The Great Filter is Jony Ive's minimalist design influence: any sufficiently advanced culture will annihilate itself before it discovers interstellar travel when someone connects two devices together that aren't compatible with the equivalent of a One True Connector™ USB cable.


I like to imagine there's an alien species zipping around in FTL space ships, and the only difference from us is they picked one endianness and ran with it.


> Either way, this is an incredibly common device that can be damaged by incompatible hardware where it's difficult to determine compatibility without bricking the device.

This is 100% Nintendo's fault for using the USB-C connector type but not actually bothering to adhere to the USB-C specification.


Exactly. If you're going to base your interconnect on an industry standard, you have two options as far as I'm concerned:

A) Follow the spec

B) Make your off-spec interconnect physically incompatible with standard devices

Going off-spec with a compatible and physically indistinguishable connector is an "attractive nuisance," to use the tort term.


It would be OK if they didn't claim to be compliant, and FAILED SAFELY when used with compliant devices.

The issue is that they're both not compliant AND fail in an unsafe way.


IMO the issue is that they fail unsafe, whether or not they claim compliance.

Nintendo doesn't claim compliance, but if they're using the connector shape, it should comply with the spec for power delivery (which is honestly not even hard to do).


again though, this is bad but not a flaw in usb-c. a manufacturer created a device that violates the spec.

MicroUSB devices can also be bricked by using non-conforming cables and power adapters, but fortunately devices like that are rare because the spec has existed long enough that all the bad devices have been flushed out of the market. manufacturers having made stupid decisions in the past does not mean we shouldn't make USB-C better now.


It absolutely is a flaw in usb-c that a large manufacturer can create a device that violates the spec and not face serious repercussions.


Is this actually a current problem? I haven't heard of this issue occurring at all in recent years beyond "better safe than sorry" warnings against it when people still ask, and even the original claims didn't seem especially widespread.

While it's anecdotal, I've been charging Switches in my household with all sorts of USB-C sources for a couple years now as well (20W Apple chargers, 18W Anker bricks, a MacBook Air, external battery packs, 12v car adapters, etc) and they're all still working.


> It should be impossible for any set of non-damaged, non-defective, spec-compliant USB-C cables and peripherals to cause fires.

That is a fairly significant list of conditions. As an example, I have encountered significant heating in a damaged power cable on the original power supply of a laptop from a reputable vendor. It's easy to argue the owner should immediately replace the damaged power supply, yet many people will continue to use problematic hardware until it no longer functions. It is also the case that many companies will manufacture components down to a price (or, in the case of this vendor, for aesthetics) rather than make something that is robust.

I find it a bit disconcerting that we're talking about delivering 240 W over a cable that consumers are unlikely to differentiate from a 15 W cable (i.e. USB-C specification) or even from a 2.5 W cable (e.g. a USB cable they would have used fifteen years ago).


> That is a fairly significant list of conditions.

It is, but it's a list of things that could fail even if there was only one kind of cable. The added risk from having a bunch of different kinds of cable with varying capability is quite small.

> I find it a bit disconcerting that we're talking about delivering 240 W over a cable that consumers are unlikely to differentiate from a 15 W cable (i.e. USB-C specification) or even from a 2.5 W cable (e.g. a USB cable they would have used fifteen years ago).

All these voltages are low enough that you can expect any cable to support them. What really matters is the amps, and the difference is much smaller there. Specially marked cables support 5 amps. All other USB C cables support 3 amps. Old USB cables will vary, but it's been standard fare to push 2.4 amps over them for a long time.
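
To put numbers on that (a quick sketch; the 48 V / 5 A tier is the new EPR level this article covers, the rest are existing PD levels):

    # USB power is volts x amps for the cable classes discussed above.
    for volts, amps, note in [
        (5, 3, "baseline USB-C"),
        (20, 3, "any compliant C-to-C cable"),
        (20, 5, "5 A e-marked cable"),
        (48, 5, "new EPR spec"),
    ]:
        print(f"{volts} V x {amps} A = {volts * amps} W ({note})")
    # -> 15 W, 60 W, 100 W and 240 W respectively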


> I find it a bit disconcerting that we're talking about delivering 240 W over a cable that consumers are unlikely to differentiate from a 15 W cable (i.e. USB-C specification) or even from a 2.5 W cable (e.g. a USB cable they would have used fifteen years ago).

Yes, exactly! It’s foolhardy.

And maybe unnecessary -- haven’t Apple just shown that you can get amazing performance with very low power requirements? And shouldn’t we all be trying to minimize power usage, to try to reduce CO2 emissions, mining of rare and toxic elements, plastic waste, etc etc etc?


What you say about efficiency is right, but it's complicated.

There are many domestic devices on 110/220 V AC that use less than 250 W. They are natively DC, but take AC because they must.

These devices are not likely to go away soon. In fact they'll proliferate so long as the developing world is still developing.

We can also expect proliferation of domestic battery and solar: both DC sources. A standardised domestic micro-grid of DC sources and consuming devices could be a boon to efficiency and reduction of single-purpose conversion devices such as power bricks and inverters.

As an example, every type of e-bicycle currently has a different proprietary adapter. Just as mobile phones once did. Most e-bikes charge at around 40v.

With 240W USB available we will likely see a switch to USB by e-bike manufacturers, resulting in more common, lower-cost equipment and better interoperability.


It's a sensible vision, but USB-C is absolutely the wrong tech for it (which, from experience, means it's the tech that will get adopted). The connectors are fiddly and expensive and fragile and unreliable and require a bunch of silicon logic to work. You can't mess about with power delivery stuff, it needs to be robust and cheap. A domestic DC power plug standard should have a single voltage standard, three pins (+,-,G), and be able to tolerate being stepped on without causing shorts.


I don't doubt you are technically right.

USB-C is presumably not the final word in USB connectors. Let's hope for both vision and better implementation.

What's wrong with having multiple set voltages? If it's the necessity of semiconductors, is that really so terrible in this day and age?

On a separate note, I hope they'd have properly managed EMI in this standard. USB has so many makers of varying quality, I can imagine cheap, badly-shielded cables and connectors playing merry hell with noise.


> a laptop from a reputable vendor

Why are comments here always so deliberately vague? I'd like to know which manufacturer. It seems like an important detail.


Yeah the problem is that there's nothing to stop product designers from specifying the connectors even though the device doesn't logically support the implied standards.

Some people here talked about the Switch; but it's a general problem. For example: I have an external USB drive that has a Type A USB 3 host socket on the back of it (???). It came with a cable with a Type A USB 3 plug on one end and a USB C plug on the other end.

Now that's a combination of connectors you will definitely see elsewhere. I have such a cable that came with my PS5 for charging its controllers - but I know for a fact that it's not interchangeable with the one for my external drive.


But what does the universal connector shape have to do with it?

Power negotiation applies to USB-A just as it does with USB-C... (I think)


> Power negotiation applies to USB-A

Nope it doesn't.


Which is why there are more fires with USB-A.


nah, there are more fires with USB-A because it's more widespread, simple as that. There are plenty of C cables from your favorite online dollar store (Amazon, AliExpress, JD, etc) which are a serious fire hazard.


>The specifications are very carefully designed

I need assurance though that my cables were carefully designed.


buy your cables from reputable manufacturers...


> We are talking about a significant amount of current.

Are we?

I'm not sure how they deliver more power in this specification. But traditionally, the additional power (from 5W to 20W to 100W) was through additional *voltage*, not current.

> And it's not something trivial that power supplies and phones can detect, there is not a way to determinate the voltage drop in the cable built into the standard (basically all they needed to do was to add a voltage sensing pin on the connector, to be shorted with VCC at the load side, so the power supply could sense the voltage at the other side and determinate if there too much drop, but as far as I know it doesn't have it).

Surely we just set a current-limit, written into the specification. Then we choose AWG wires / connectors as appropriate to support that current.

Voltage can't go up arbitrarily, but voltage can safely be increased to ~48V or so in most applications. After 48V, humans start to get shocked / hurt, so that's probably the reasonable limit.

-----

And I'm pretty sure you can sense the voltage drop across a cable. USB delivers 5V by default, and then you send protocol commands to increase your voltage to 12V or whatever. If you detect that the power supply is only supplying 11V (after the 12V command), then it's either a PSU error or a cable error. And I'm not sure it even matters which is which (either way, you're not getting enough power, so your device probably needs to shut off).

You can then disconnect / reset your device and maybe stick to 5V default specs.
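
A sketch of that fallback logic in Python (hypothetical names; the 5% tolerance is an assumption for illustration, not a value from the spec):

    def check_negotiated_rail(requested_v, measured_v, tolerance=0.05):
        """If the rail sags below what was negotiated, drop back to 5 V."""
        if measured_v < requested_v * (1 - tolerance):
            return "reset and stay at the 5 V default"  # PSU or cable fault
        return "ok"

    check_negotiated_rail(12.0, 11.0)  # -> "reset and stay at the 5 V default"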


It's 5 amps at 48 volts, defined in https://usb.org/document-library/usb-power-delivery

That level of power delivery is negotiated after the devices are connected. The cable also has to advertise that it is capable of that power delivery.


> Voltage can't go up arbitrarily, but voltage can safely be increased to ~48V or so in most applications. After 48V, humans start to get shocked / hurt, so that's probably the reasonable limit.

Car batteries in ICE cars are nominally 12v. Grab both terminals and tell me it doesn't hurt.


Ok, done. Many many many times.

12v can't push meaningful current through the resistance of dry human skin, so it doesn't matter how much current capacity the battery has. You can hold onto those terminals all day.

It's settled science. If you doubt it, tests of this are all over YouTube. It's just true.

Now your tongue, perhaps...
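
The back-of-envelope math, with ballpark resistance values (assumptions, just to illustrate the point):

    # Ohm's law: I = V / R
    v = 12.0                 # car battery, volts
    r_dry_skin = 100_000.0   # dry hand-to-hand contact, order of magnitude
    r_tongue = 1_000.0       # wet mucous membrane, far lower
    print(v / r_dry_skin)    # 0.12 mA, below the ~1 mA perception threshold
    print(v / r_tongue)      # 12 mA, very much felt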


So my cat doesn't have to worry?


If you short a car battery, you might not get electrocuted, but the results are unlikely to be pretty.


You can't short a car battery by touching the terminals with your dry, unbroken skin, any more than you can short a AA battery by touching both ends.


You aren't shorting the battery in this case. There's still a significant resistance (your body, particularly the skin contact resistance) between the terminals.


...it doesn't hurt. Dry skin isn't conductive enough to pass any significant current at 12V. But if you inserted electrodes into your fingers so the electricity conducted through your wet inner bits, you can be killed by a lot less than 12V across the heart.


I have never felt anything holding 12V DC in my hands. Putting your tongue across 9V "transistor" battery terminals is another thing though.


I left a 9V battery in my pocket once and it touched a penny. I definitely felt that, lol.

But yeah, 9V and 12V don't hurt your skin at all. The worst you'll get typically is when you accidentally short the battery with some metal, and something becomes burning hot really quickly.

But this "burning hot" issue can happen even with 1.2V NiMH batteries or 3.7V Li-ion batteries (even hot enough to start a fire in your pocket! Like all those vaping accidents).

That's not "shock" or "electricity", that's literally heat from something else messing up the battery pack. So it's not really the same.


I've done this. It doesn't hurt. In fact I didn't feel anything besides the battery terminals.


When I was a kid, we had this massive high-voltage transformer that produced about 2,000 volts, at some sick amperage.

We used it to make jacob's ladders. That was fun.

Until I touched the two bars.

I woke up on the other side of the room, smelling burning...me.

That wasn't fun...


I learned this recently:

> If you receive a shock, it can affect the rhythm of your heart and cause problems later. We’re taught to go and get an ECG after receiving a shock. Like jump in the car and go to the doctor or hospital, and have someone else drive you. If you’ve received a shock - go get checked out. -jaidan

https://news.ycombinator.com/item?id=27259589


yup, always work around high voltages with your left hand (the one closest to your heart) in your pocket, to discourage unwanted currents from passing through the heart


As the old joke goes, experienced electricians touch even their wife's breasts with only one hand, just in case.


The main reason being that you don't want to be working with both hands, since then current can travel in one and out the other, directly across your chest.


I saw that too but I wouldn't spread it as a fact without a better source.


I found this, but it's more than a mild shock:

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2658458/

Bing's answer-mode points here:

https://healthcare.utah.edu/the-scope/shows.php?shows=0_esno...

Always hard to know how cautious to be, with so many variables.


I recently had some industrial training that made the same point. The recommendation there was to immediately hook up an AED.

You can go into tachycardia or bradycardia (fast and slow heartbeats), so feeling sort of okay isn't the end of it.


Hadn't heard of this, and initial results had no clear link to electricity:

https://wonders.physics.wisc.edu/jacobs-ladder/


Good god, you were lucky to live!


You have no idea...

That was nowhere near the stupidest thing I've done...


It just tingles a bit if anything.


It doesn’t tingle.


From what I understand, the cable has to advertise the supported power output, it’s not like a raw power outlet that will start a fire if you use a random thin cable. A 5W USB cable will probably never receive 240W, even by mistake.


When I last worked with USB, there were all sorts of resistors across the pins telling you about a cable's capabilities. I'm pretty sure USB-C has that as well. Your device should check the spec of the cable and only draw the amount of power it's rated for.
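
For USB-C specifically, the source advertises its current limit with a pull-up resistor (Rp) on the CC line, and the sink reads the resulting voltage. A simplified sink-side sketch (resistor values and thresholds paraphrased from the Type-C spec; note that 5 A additionally requires an e-marked cable, per the sibling comment):

    def advertised_current_amps(cc_volts):
        """Map CC-line voltage (5 V Rp against the sink's 5.1 kOhm Rd) to amps."""
        if cc_volts > 1.23:   # ~10 kOhm pull-up
            return 3.0
        if cc_volts > 0.66:   # ~22 kOhm pull-up
            return 1.5
        return 0.5            # ~56 kOhm pull-up, default USB power

    advertised_current_amps(1.69)  # -> 3.0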


There's a microcontroller in the cable negotiating the power transfer rate, data transfer rate, and other capabilities.


Is a resistor really enough to determine if a 5W cable gets 240W?


Current is often measured by putting a small resistor of a precisely known value in series with the load and measuring the voltage drop across it.

I = V/R
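
A worked example with illustrative values:

    # A shunt (current-sense) resistor in series with the load:
    r_shunt = 0.010           # ohms, precisely known
    v_drop = 0.050            # volts measured across it
    print(v_drop / r_shunt)   # I = V/R -> 5.0 A
    # At 5 A the shunt itself dissipates I^2 * R = 0.25 W, one reason
    # sense resistors are kept very small.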


I have also had my phone's USB-C port spontaneously self-destruct, effectively ruining the phone. As you say, burning smells and melted plastic, and I shudder to think if I hadn't been there to power it off and yank the battery.

And just this past weekend, a close friend of mine had their Galaxy S6 drain completely to 0 in a minute or two, and then claim water damage in the boot screen (which tests resistance across the port). The phone wasn't wet.

Both times, the phone wasn't even plugged in. I am beginning to look very askance at USB-C ports.


> Not using the proper cable could lead also to fires, for example

Only if the cable or device is defective or damaged. Which is true of every type of charging cable or connector.


> defective or damaged. Which is true of every type of charging cable or connector.

Yes, that's my experience with USB-C as well.

(Seriously, my Dell laptop and Dell dock connected by a Dell cable can't keep up a reliable connection. People are saying that the spec is fine, but a spec that's this hard to implement well is maybe not fine.)


Explaining to my aunt or uncle or grandparents that, no, the cable and charger they bought isn't the right one for their phone/laptop/whatever even though it fits isn't ideal. Sure, it might just charge slowly (cue family tech support call), but it could also just show "not charging" on OS X for example, which is just confusing for most people.

It's a real problem, even if tech savvy people are fine with lower charge speeds because we know that the charger/cable/device combo only supports PD profile whatever.


Even I have this problem. I bought an amp meter and throw out anything reporting below 0.4 A. So many devices come bundled with poor cables that my house is riddled with cables I never want to use for charging.

I'm actually TERRIFIED of plugging anything into my kids' Switch other than Nintendo's official cable, in case I brick it.

Does anyone know if they've patched this? I have some Dell laptop USB-C chargers which I use a lot, and I've had to caution everyone in the family to never plug the Switch into one, despite it fitting and despite every room in the house having such a charger...


The Switch cable situation is always going to be a bit finicky, because their port is on the extreme edge of the spec's tolerances - it gets shorted because it's actually possible to cross the power into the wrong terminal, IIRC, with some wiggling. Safest to go with the official stuff, which people haven't reported issues with.


The Switch's USB-C implementation is genuinely broken, so this isn't the fault of the connector. A spec-compliant charger, cable and connector will never brick anything (or catch fire, for that matter)


I think I've used:

- a Pixel charger

- a MacBook charger

- an HP charger

and the Switch works fine with all of them.

OTOH, if you try to use one of those for the docking station, the docking station will refuse to work.

I heard that it's because the dock supports only a lower wattage than what the MacBook charger can provide (60W or 85W), while other chargers, like the Pixelbook's (45W), work just fine.

But I agree that it's unnerving: I connected a disposable Switch to the cables I didn't trust, before trusting my main Switch with them.


I understand that simply charging is fine, while video output is the riskier action because of unofficial docks that use a Switch-incompatible voltage for signalling. The wrong voltage can burn out a specific chip and require resoldering.

For what it's worth I sold the official charger/dock that came with my Switch more than a year ago and have been using only third-party chargers/cables plus a Covert Dock since then. No issues :)


I bricked my Switch on a multi-device charging station. Had to mail it in to Nintendo and they replaced the whole motherboard. Surprisingly, the entire repair process cost less than a replacement Switch motherboard on iFixit.


I use an HP laptop USB C charger with my switch sometimes, if that anecdote helps.


I spent years explaining to my family that it's not a catastrophe if they can't find their phone charger: it's micro USB, see, just take this one from the drawer; it doesn't matter if it says Samsung or Sony, if it fits it will work.

They were right to be mistrustful, apparently.


How is this any different than buying the wrong cable in the past? You used to have to choose USB2, USB3, USB3.x, micro USB, mini USB, DisplayPort, Mini DisplayPort, HDMI, mini HDMI, VGA, DVI, thunderbolt, and more.

I think people are either too young to remember this disaster, or have just outright forgotten.


Allow me to say it's slightly different, because you could basically tell the difference just by looking at the connector carefully. The knowledge that if it connects, it'll 99.9% work is deeply ingrained in all of us. USB-C breaks that norm.

Now you have to carefully read what the packaging says, which can range from inadvertently unclear to downright misleading. And to add to that, you don't know what to look for in the first place if you don't have prior experience with Thunderbolt/USB-C. For example, I didn't know that a cable must explicitly mention DP in order to connect to an external monitor, because it was the first time I had an external monitor that could use USB-C.

Not really complaining, USB-C is very, very nice. Just pointing out the minor annoyance of learning its warts.


The issue is more the ubiquity-- when everything from my phone, to Nintendo Switch, to headphones, to flashlight, to laptop use the same cord for power and data, it's a bigger problem than the low spec HDMI cable you plug in once.


Meh, I would rather know that what I have is going to work correctly. Now we're going to have many more calls to customer service complaining that our devices aren't charging/transferring-data at the advertised rates and every rep is going to start by telling us to use only the cable that was provided with our device. I, a technical person, don't enjoy dealing with this stuff, and it's going to be hell for my non-technical family and friends (who will naturally come to me with their problems).

That it won't damage their device is certainly wonderful, but I definitely think the cons outweigh the pros.



A good example happened to me yesterday. I brought my MacBook and charger to my partner’s family’s place along with my USB-C SSD that has some files I thought I might need on it; however, I managed to forget the USB-C charging cable for my computer. I ended up using the USB-C cable that came with my SSD. It’s not charging at full speed, but it’s working!


Spot on. It's all a massive step forward from the times when my mom asked me which cable to plug in where, and identical (but fully different) PS/2 connectors for mouse and keyboard were still a thing.

The fact that power doesn't just go through a data cable, but actually in all kinds of voltages, useful power levels and in _both directions_ is nothing short of awesome. Fun fact: I didn't really find the charging port on my new USB-C power bank until I realized it's the same as the output port...

The fact that even degradation is graceful is great. I don't always have the 90W brick with me when I'm on the road with the MacBook, but more than once I've not regretted at least having a lightweight USB cable in my backpack in order to use a phone charger overnight. Granted, I can't play games on it in that configuration, but hey, it'll still hold up nicely over many hours and I can still work.

Good times!


"Works" may be a spectrum, but "works optimally" is binary.

There is so much variation in the USB spec that a data transfer or a battery charge could take a few minutes or a few hours depending on which cable and which port/adapter you use, with no foolproof way to make sure you're putting together the right equipment, because regardless, it still "works", just not very well.

Don't get me wrong, it's great that it "works". I just wish it was clearer what I need to make it work optimally, aside from just using the one brick and cable that came with my device (assuming I was so lucky).


I can count the number of times I have ever had to charge my laptop with somebody else's cable on one hand. And I've been using laptops exclusively since 2006.

My current laptop (an XPS 9310) has USB-C for charging. But I would be very reluctant to ever charge it with somebody else's cable, and without a 'usb condom' I wouldn't even consider charging from some random public cable (e.g. airport charging kiosks.) USB charging for laptops has the same "untrusted cable" problems as USB charging for cellphones.


> It actually is sort of nice, because "works" isn't binary.

Spoken like a true gaslit tech user.


We should give anyone involved in the USB-IF a test:

Here are 10 different cables; you must accurately describe each of their features in detail (USB speeds, charging speeds/power delivery, video out? What version of DisplayPort does it support?). Make one mistake and you'll be executed.

Obviously I don't advocate for that, but it's damn annoying that one cable can do so many different things with no way to know by looking at it. At least USB 3 was oftentimes blue to offer a distinction.

These days I plug in a cable and prey it functions as advertised (sometimes it's 50/50).

Awful mess.


The issue isn't the number of things supported, it's that there are optionally-supported things. If every cable needed to support everything, and every host port needed to support every valid device type, it'd be fine. But instead you get cables that don't properly work with alternate modes, host ports that don't implement DisplayPort, etc. It's cheaper, but ridiculously frustrating.


Sorry, but not true. Getting full bandwidth really limits how long the cable can be. Like, I don't think you'll find 40Gbps cables longer than about 0.5m.

This is more expensive too (e.g. there are generally chips in the plugs at either end to handle attenuation).

But there's obviously a use case for cables longer than that.

It is true that we have cables that support data but not power, power but not data, data at different bandwidths, and so on. It's a mess. But it's not a case of simply choosing not to support optional features.


The Apple Thunderbolt 3 Pro Cable (2 m) is one of the only cables that can support everything USB-C over such a long distance [1]. But it costs $129, which just wouldn't work for smartphone charging, where cables (USB-C 2.0 & 60W, the least you can do while still being spec-compliant) cost $7. I don't think I've seen such a clear and easy-to-grasp demonstration of tradeoffs anywhere else.

[1] For the longest time, this was the only cable to do it all. I think more are available now, but they're still very much non-affordable.


No, it's a matter of expense. It's possible to embed a redriver in the cable every 0.5m, and thus get longer cables at full speed. They're just very, very expensive.

There are three alternatives: one cable type that does everything and is easy to use (but gets expensive quickly as it gets longer), multiple cable types that each do somewhat different things but use identical connectors, or multiple cable types that each do somewhat different things and use different connectors. The USB-IF went with the "multiple cable types, identical connectors" option, which is cheap but extremely confusing. The "multiple cable types, multiple connectors" is what USB was created to avoid. So the only remaining option to remove confusion is to have expensive cables.


> These days I plug in a cable and prey it functions as advertised (sometimes it's 50/50).

Are you preying on the folks at USB-IF?


I haven't had to delve into the world of trying to find an appropriate USB-C cable until fairly recently. I didn't actually understand the situation, thinking a replacement USB-C cable should just work.

Oh hell no. None of the packaging actually explains what features a cable supports or what it's actually compatible with. The little Android logo doesn't mean shit. Sure, you might be able to slowly charge your phone with a particular cable, but it doesn't mean it'll do anything else it's supposed to do.


And every time I buy a new device and think I'll be able to use my existing USB-C cables, I'm met with a new standard. Bought a new external GPU enclosure? My MacBook charging cable won't work; I need a 40Gbps Thunderbolt THREE (not 2) cable. Bought a new Oculus Quest? Neither my eGPU nor MacBook cables work; I need an $80 Link cable.


I wonder if the committee has looked at integrating colour or some other indicator into the standard so that cable/port capabilities are clear visually.

Though ofc the design challenge is for users to feel comfortable that putting red into blue won't break anything, it just might not give the expected features.


From the actual spec[1]:

> All EPR cables shall be Electronically Marked and include EPR-specific information in the eMarker as defined by the USB PD specification. As defined in the USB PD specification, EPR cables are marked as 50 V and 5 A capable. All EPR cables shall be visibly identified with EPR cable identification icons as defined by the USB-IF. This is required so that end users will be able to confirm visually that the cable supports up to as high of PDP = 240W as defined in the USB PD specification.

Both are important. I also wish devices had some UI to easily show the capabilities of a connected cable to the user. I could not find actual visual representations of the "identification icons".

[1]: (page 143) https://usb.org/document-library/usb-type-cr-cable-and-conne...


Can confirm, sucks if you're blind.


Some kind of Braille markings, or similar, really should be a standard part of the spec.


Indicator recommendations are routinely ignored. There are already color codes for type-A ports, and there's the "lightning bolt" indicator for Thunderbolt USB-C ports, but PC manufacturers ignore them when they want their laptops to have a certain look and feel (e.g. gaming laptops want red or green everything, Apple wants aluminum everything, etc.).

Cable manufacturers are going to do whatever is cheapest.


Since the design philosophy behind USB-C is "it must always seem as though it should work", I think we'll need something like the old "tube testers" a few of us still remember.

Decades ago, when your TV was behaving badly, you would take the back off (unplug first!), pull the vacuum tubes out, and take them in a bag to the nearest convenience store. They had a "tube tester" the size of a washing machine. You'd plug each tube into a connector, one at a time, press the "test" button, and the needle on the dial would show good/poor/fail. The body of the tester was a cabinet containing replacement tubes you could buy (and which you could even test right there).

We might need to start using USB-C cable testers. Plug in a new cable, get a good/poor/fail analysis of each potential USB-C feature, then you mark your cable yourself with some labeling scheme.


God damnit - TAKE MY MONEY!

I've been having to implement tons of my own testing because no cable ever works the way you think it will. I've got a giant pile of thick, super-beefy USB-C cables which only transmit data at USB 2.0 speeds. Which, per the USB standards committee, is apparently Working As Intended, but that is definitely not OK.


I remember some drugstores had these when I was growing up. My grandpa had two of them in his garage; he had bought the machines and the stores' collections of tubes when they were falling out of fashion.


Excellent idea.

I remember going through the tube testing ritual with my Mom. Good times. It was really interesting to me, how my Mom could take things apart and figure things out.

(She taught me that the inside of a TV was dangerous, and kept the screwdrivers up where I couldn't reach them.)


I'm down to work on this idea.


Yeah, it's a bit of a tragedy.

Resistor-style stripes on cables would be fun: a stripe for each feature such as bandwidth, power, etc.; perhaps this has already been suggested (and ignored). Though the complement for ports would be trickier.

Embossing symbols in the style of thunderbolt might be the way to go, even if the standard is not fully adopted.


> each feature such as bandwidth, power, etc;

Honestly that's about all you need.

Even just a bandwidth indicator would be suitable 90% of the time.


Ha, I love the color idea.


This is terrible for colorblind people.


For accessibility, assign specific texture patterns on the outside of the insulation for different capabilities. Then you can tell the capabilities by running the cable through your hands.


I actually have started down this path for my own cables -- the nylon braided ones are lower power etc.

My first attempt was different colors for each device, but was too tedious.

Three categories - low power, high power slow USB2, and 100W Thunderbolt.

- Low power: generic cables from wherever

- High power slow: cables that came with reputable charger (the vendor's cable, like Apple or Samsung; my favorite 3rd party charger is https://www.ravpower.com/products/rp-pc112-gan-tech-61w-wall...)

- Thunderbolt: I buy from https://eshop.macsales.com/shop/Thunderbolt


Or! We could go one step further, and change the connector to make sure that people use them correctly. Wouldn't that be clever!?


I don't actually want this. I would rather have the problems that come with incompatible cables than the problems which come with incompatible ports.

In particular, if my Thunderbolt ports didn't support bog-standard USB, that would suck. I would need special ports which weren't as powerful, or even more dongles, or both.

As it stands, things are... fine, actually. I have one TB cable that works on everything, and a small handful of USB-C cables which work on most things. And a USB-C-or-Thunderbolt-and-I-don't-know-or-care to DisplayPort cable which stays plugged into my monitor. And a USB-C-or-Thunderbolt-etc to microHDMI one for my camera.


I think that doesn't make a lot of sense.

You'd need a cross product of different cable shapes for all the different available features, since the cable could support any number of them.

On top of that, the connector has mechanical constraints to handle, and some of these connector shapes would be suboptimal and undertested.


People already incorrectly buy USB-A when they should have bought USB-C, or vice versa. Changing the physical form factor doesn't solve this problem.

"I thought I needed a USB cable"?

Also, it's a huge waste if, say, an HDMI 2.1 cable cannot be used on an HDMI 2.0/1.4 device. People would complain about that too.


> People already incorrectly buy USB-A when they should have bought USB-C, or vice versa. Changing the physical form factor doesn't solve this problem.

At least the nature of the problem is immediately apparent to such people the moment they attempt to plug the cable in. If the cable won't fit, they know there is nothing wrong with their computer; they didn't misconfigure anything or click the wrong button. The problem is unambiguous.

Some USB-C cables not working with some USB-C sockets leaves users feeling gaslit.


Except people have damaged ports by forcing incorrect form factors. Also, we've had tons of other standards where cables vary in performance: HDMI, DisplayPort, IDE, USB 1/2/3.

You'll get people complaining about the cable not working with their new device, and then people complaining it didn't work with their old device.

Eventually USB-C will be so capable and ubiquitous, this will be a non-factor. I don't miss the days of trying to track down the right barrel connector, micro-HDMI cable or proprietary and fragile network dongle adapter. Those weren't easier days.


I would love to see somebody attempt to damage a micro-USB or USB-C port by forcing a USB-A plug into it. Not going to happen, sorry. They may as well try to force a NEMA-5 plug into a VGA port. It's nonsensical.

At a certain point on the usability continuum, user error becomes so severe that only dementia can explain it. But USB-C incompatibilities are so far from that point that arguing otherwise seems like bad faith. These USB-C incompatibilities can bite technically inclined people with sound, sober and healthy minds.


What is your suggested solution? Every potential configuration USB-C can offer needs its own physically different cable? We're going to need a dozen different form factors now. How do you do that at scale economically?

How does it help when someone inevitably buys the USB-C cable with shape y when they need USB-C cable with shape x? It physically doesn't fit, great, but they still have the wrong cable and their device still doesn't work, not even in a degraded fashion. They still have to take it back to the store.

Typically these issues bite people who bought cheap junky cables that weren't USB-IF certified off Amazon by sorting for lowest price. If it's not working, check that your cable is certified for what your intended use case is. This applies to everything, not just USB-C.


They could, at the very least, enforce color coding. Colorblind people would still be left high and dry, but with the status quo everybody is up shit creek.

> How does it help when someone inevitably buys the USB-C cable with shape y when they need USB-C cable with shape x? [...] They still have to take it back to the store.

It helps because they know they have to go back to the store, and aren't left wondering if the problem is actually with themselves somehow using their computer wrong. As I mentioned earlier: "Some USB-C cables not working with some USB-C sockets leaves users feeling gaslit."


USB3 did have blue color coding on the plastic interior of the connector. That doesn't exist on USB-C. You'd have to color-code the metallic connector or the connector housing. Forcing a color-coding scheme on the connector housing would clash with branding, so you'd likely end up with companies ignoring the color coding.

People will also ignore the color coding even if it existed. Counterfeiters would add the color to add legitimacy to their incompatible products. Color coding would not physically prevent you from plugging the cable in.

The people who feel gaslit over a USB-C cable not working would probably also feel gaslit over buying the "wrong USB C cable", because they bought a "USB C cable" and "USB C should just work, why do I have to remember which of 12 different connectors my computer uses, I thought the point of USB C was a unified connector".


> The people who feel gaslit over a USB-C cable not working would probably also feel gaslit over buying the "wrong USB C cable",

They might feel deceived or misled by the packaging, but they won't have to wonder if they are somehow using their computer wrong. Particularly for people who lack confidence with computers, this means a lot.


While I can understand the desire to make it easier for non-savvy computer users, I don't think it's worth destroying the technical gains USB-C brings with it. Again, we've had non-savvy users hook stuff up wrong or destroy things even when ports had differentiated shapes. A software solution to identify the incompatibility and notify the user seems more practical than redesigning the connector for every evolution of USB-C.


If at first the cable doesn't fit, apply more force.


Better: add a color band to the connector. One line for low power, two for medium, three for high.


The problem is you have:

Power: 18W, 30W, 45W, 60W, 100W, 240W

Data: No Data, 480Mbps, 5Gbps, 10Gbps, 20Gbps, 40Gbps

You also have a potential variety of modes that could influence your ability to achieve those speeds with certain devices.
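
Just to count the label space this implies (illustrative; real cables cluster into far fewer shipping configurations):

    # Nominal tiers from the lists above:
    powers = ["18W", "30W", "45W", "60W", "100W", "240W"]
    datas = ["no data", "480Mbps", "5Gbps", "10Gbps", "20Gbps", "40Gbps"]
    print(len(powers) * len(datas))  # 36 combinations, before even
                                     # counting alternate modes (DP, etc.)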


That sounds like two markings, not too hard.

Also there are no power ratings below 60W, and I don't think "no data" is a valid cable.


USB-PD supports 5, 9, 15 or 20 V at between 0 and 5 amps. Higher power typically needs a thicker-gauge wire.

There are definitely "charge-only" cables being marketed.

Two markings with each having multiple variations. You will have a dozen plus combinations, that will only get more complex.


> USB-PD supports 5, 9, 15 or 20v at between 0 - 5 amps. Higher power typically needs a thicker gauge wire.

Every voltage below 20 maxes out at 3 amps. As far as cables go, 60 is the minimum watts supported.

> There are definitely "charge-only" cables being marketed.

Are they valid though?

> Two markings with each having multiple variations. You will have a dozen plus combinations, that will only get more complex.

I don't really care how many combos there are. I won't need most of them. Most of the time I just want speed or power, sometimes I want both, and it's always a minimum requirement rather than anything specific.


>Every voltage below 20 maxes out at 3 amps. As far as cables go, 60 is the minimum watts supported.

Perhaps for PD 2.0/3.0 but there is also legacy USB PD 1.0 and USB-C BC. During that era I believe 1.5A was minimum.

>Are they valid though?

Probably not, looks like USB 2.0 data is minimum required.

>I don't really care how many combos there are. I won't need most of them.

Just because you don't need specific combinations, doesn't mean you won't have to wade through all possible combinations to find the cable you're looking for.


By the time the relevant committees have worked over that idea, there will be fifteen bands in colours that you need tetrachromic vision to differentiate between.


Can’t do color as people are color blind. Need to do it by number of bands.


Didn’t Europe ban this?


jquery was downvoted but I recall articles like this (https://www.theverge.com/2020/1/17/21070848/eu-apple-europea...) with the EU trying to ban any "non-standard" connector.

Maybe non-standard connectors are actually a feature, not a bug, because everything having the same connector but not the same functionality will lead to confusion as described above.


I don't think it was ever true that the connectors on the two devices determine whether they're compatible.

You can find all sorts of different connectors on a cable to connect two devices together, but you still only know whether the devices are compatible by trying to plug them in.

At least with the current setup, you are unlikely to burn out the devices by trying a cable.


This does seem like it would be helpful. They started coloring USB-A ports blue for those compatible with USB 3 high-speed data. I think the challenge may be getting people to understand it, more than being afraid to plug blue into black. I know quite a few people just assumed the color choice was a design choice and that if the cable fits, you plug it in.


My Oculus Quest works fine with a 3-meter AmazonBasics USB cable that I got for like $15. I think the only advantage of the official Link cable is that it gets you 5m without needing to add an extender to a 3m cable.


It works, but if you run a high-bandwidth game like Flight Simulator you might hit some performance limits. You can check your cable speed in the Quest desktop app.


I’m not the same user but I haven’t had any problems with my cheap usb cable. Only that the cheaper cable I got is shorter.

I haven’t done the speed test in a while but last I checked I think it was 3Gbps? Not quite the theoretical max for the port/cable but easily enough for the few hundred megabit video feed.


Yeah the Quest even added USB 2 support.


Welcome to the world of USB-C! My general approach, and recommendations to others, is to purchase only Thunderbolt 3 cables.

Yes, they're more expensive. But they'll essentially handle everything you throw at them, and behave as expected (due to the large protocol support [0,1]).

Of course, this won't always be the case. But it's generally a safe assumption for now.

[0] https://www.intel.com/content/www/us/en/products/docs/io/thu...

[1] https://thunderbolttechnology.net/sites/default/files/Thunde...


Spending $65 on a 6 foot cable as a "fix" for that doesn't seem like much of a fix at all.

https://www.monoprice.com/product?p_id=24721


> "My general approach, and recommendations to others, is to purchase only Thunderbolt 3 cables ... they'll essentially handle everything you throw at them"

Is this really true? My Thunderbolt monitor (LG Ultrafine 4K) came with two cables: Thunderbolt 3 (for connecting to Thunderbolt laptops) and USB-C (for connecting to iPads etc which don't support Thunderbolt). Why would two different cables be needed if the Thunderbolt 3 one can do it all?


Some active thunderbolt cables don't properly support USB. A good one can do it all, as far as I'm aware.


The cables designed for Thunderbolt 3 + 100W charging don't seem to be capable of the fastest USB speeds in my experience. That includes Apple's expensive Thunderbolt cable. They fall back to slower speeds when used as a USB-C cable.


Try finding any accessory that has USB-C.

It's been over 5 years already.

If your phone is about to die because you checked out of a hotel in the morning and your flight is at night and you are just cruising around in a car, you will not be able to stop anywhere and get a USB-C car charger.

So anyway, LPT: get USB-C accessories on Amazon in advance and don't rely on them to have any particular feature or quality control.


I've seen USB-C cables for sale at most gas station convenience stores, both C to C and C to A. I've seen lots with car 12V to USB-C as well. They're not rare around me.


USB-C sure turned into a mess, didn't it? So far, I only have two such devices (a cell phone and a set of headphones), and there's no problem charging either. But even there, we already have proprietary extensions (Qualcomm quick charging, for instance), and then there's Thunderbolt, and now this.

I'm not enthusiastic about wading into the world of "every port looks identical but isn't" that USB-C has given us. I have to keep my cell phone charger cable with the charger at all times, because it's my only cable compatible with Qualcomm quick charging. Naturally, there's no visible indication of this.


I don't know, the flipside is that Thunderbolt delivers genuinely cutting-edge interconnect capabilities. I can run two 4K60 displays and a dozen peripheral devices, and charge my laptop, all via a single cable off of my $100 Thunderbolt 3 dock. There is no other technology that comes close to this capability at this price point.


At the same time, I have no idea if the cable I have is a Thunderbolt or USB-C cable. There is no standard marking.


Thunderbolt cables generally have a lightning bolt mark, and are expensive and short and thick+inflexible. And because of that, you likely don't have any except for the one coming out of a thunderbolt dock or external GPU enclosure etc.


My Apple cable is, I believe, Thunderbolt, but I'm not sure. It's white with no marking.


If there's no lightning bolt marking, it's just a charge & USB 2.0 cable, not thunderbolt. Genuine Apple Thunderbolt cables are marked, and the one you got with your MacBook for charging isn't one.


Does it have frayed rubber insulation around the connectors and randomly along the cable? If yes, then it is a genuine Apple (tm) cable. If it has solid insulation and looks good then it's probably a Chinese knockoff. :)


You either have a fake cable or a USB 2.0 charging cable


You might not know in advance, which is a problem, but it seems like a pretty trivial issue. Just test it and see if it works; if it doesn't, return it and get a different one. Then repeat the process until you get the one you need.


Or you could look at the cable from the get-go, read a simple marking, and not have to do this song and dance.


Your Thunderbolt cables don't have the lightning bolt logo on them? I didn't think it was a 'requirement', but every TB cable I've seen has the Thunderbolt icon.


Not my apple one.


Then it's either not genuine Apple or it's not Thunderbolt. This is what an Apple Thunderbolt cable looks like:

https://www.apple.com/shop/product/MD862LL/A/apple-thunderbo...


That's not a USB-C cable. The discussion is about USB-C cables capable of thunderbolt. Not the older, dedicated thunderbolt cables.

Correct link: https://www.apple.com/shop/product/MQ4H2AM/A/thunderbolt-3-u...


Good catch - still has a lightning bolt on it though.


Which made me realize…a lightning bolt symbol could just as easily be interpreted by laymen as “charging only”!

What a mess…


So…. with a lightning bolt then.


The standard MacBook charger that comes with any Apple laptop is not thunderbolt.


do those screens need to be 'thunderbolt' as well? or can they just take a usb-c connection (but ... not be thunderbolt?)

I have a 2019 MBP 15". Every 'dock' I've looked at seemed to indicate that "you will see mirrored screens on multiple displays" - which is not what I'd want.

Perhaps all of this is because I'm 'only' using a 2 year old MBP, and this is somehow all Apple's fault?

Would love to know what specific $100 thunderbolt 3 dock you have. It seems to be a confusing mess of half-information whenever I go to shop for stuff.


> and this is somehow all Apple's fault?

Unfortunately yes. Apple (or macOS specifically) doesn't support DisplayPort MST, which allows using multiple displays over a single DisplayPort connector. Since non-Thunderbolt USB-C video is just DisplayPort, that means many USB-C docks with multiple display outputs don't work. Why macOS supports multiple video outputs over Thunderbolt but doesn't support MST is beyond me; everything else does.


No, the displays don't need a Thunderbolt controller (and I do get 3 separate displays - mirroring would be pretty useless, I agree).

I use this dock: https://www.amazon.com/Lenovo-Thinkpad-Thunderbolt-Dock-40AC...

After updating the dock to the latest firmware, I can run two 4k60 displays, although it requires a specific configuration: one display on one of the DisplayPort lanes, and one on an active USB-C to DisplayPort cable hooked up to the dock's unpowered Thunderbolt port.

I've done this setup multiple times with this dock model, and it has worked with every Thunderbolt 3 MBP I've used.


It's not _terrible_, but could definitely be improved.

I'm fairly content with my current situation. My phone has 'Warp' charge, which fully charges it in less than half an hour; however, the same charger can also charge (at a slower rate):

* Earphones

* Shaver

* Handheld Fan

* MacBook

I rarely connect my devices to a display, but that is supported with my cable too. The only device I use semi-regularly that isn't USB-C is my Bose QC35s, but they run ~20hrs on a charge, which usually lasts a few weeks given my low usage.

On the other hand, my Mac charger can charge my phone (and obviously Mac) but none of the other devices...


After only reading the headlines promising the bright future of USB-C, I am very disappointed now that I carry three USB-C cables in my backpack for different use-cases.


Can you explain the differences between them? I don't have many USB-C devices (Switch and MacBook) but wouldn't you be able to use the most "fully featured" cable for every scenario?


The most "fully featured" cable with a USB-C tip is often like <0.5m long.


And $40+. Buying a long one for a music recording interface (that sweet sweet I/O) was super painful.

I guess it's cheaper now, but in what world is a $50 cable to power an accessory reasonable in any way?

Source: https://www.sweetwater.com/c1225--Thunderbolt_Cables


It's kind of funny how (pre-covid) people at the office would point out my dock's TB/USB-C connector and how thick it was.

It carries probably 100W+ for power, multiple 4K displays, ethernet, half a dozen USB3 ports... I'd rather it not be the thickness of a 5W charging cable.


which perfectly encapsulates the "problem" with usb-c: everybody wants a cable with all the features, but only wants to pay for the cheapest cable. and then gets frustrated when their $3 cable doesn't support thunderbolt and high-power modes.

you can get 3-metre thunderbolt cables if you want, you just have to pay a lot for them. all the problems with USB-C can be solved by spending more money. if you want to go the cheap route and buy cables that don't support all the various modes, then you're going to end up with a bunch of cables that don't support some modes and have to keep track of which ones those are.

if you regularly use devices that need higher-spec cables, it's probably worth keeping one good high-spec USB-C cable in your bag, instead of three terrible ones.


Quick Charge 3.0 is more-or-less a hack and shouldn't be done on USB-C (unfortunately, cheap manufacturers still do it). Quick Charge 4.0 is just USB-PD, so a different name but compatible.


Are there cheap devices that measure USB cables? So that the user plugs the cable in (both ends) and it lists which protocols/bandwidth it supports. And electrical resistance, of course.


Maybe with an attached labelling machine to print the results in a way that could easily be affixed to the cable? ;) Oh, and please with an integrated wirecutter that automatically destroys cables if their results are too bad?


Heh I had a cursed micro-USB cable at the office that only carried power, no data - had fun debugging why my board did not work.


Sometimes those are useful. I've got a Canon camera which can charge by MicroUSB, but if it detects any data it doesn't engage the charger and does data only. So having a power only cable is the only way to charge it without taking the battery out and putting that on a charger.


My micro-USB charging cable is worse than that. It appears to transfer data but it always flips a few bits. Quite surprisingly, no part of the protocol stack catches this. The cable quietly delivers slightly broken files.


I've seen a lot that are JUST good enough for android debugger to connect, but then drop as soon as data is sent.


It seems that half of the micro-USB cables I get only work for charging, not for data. When I find one that does reliably connect I label it and then hide it where even I can't find it.


A 5-pack of reasonable quality micro-USB cables is less than $10, FYI.


These charge-only cables are supplied as standard with cheap chargers.


https://www.chargerlab.com/category/power-z/

These read out the e-marker and tell you the supported power delivery and USB data speeds.

The irony that their product comparison table has 30+ rows should not be missed!
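
Under the hood these testers are less magical than they sound: they speak USB-PD to the e-marker chip in the plug and decode a capability bitfield. A toy Python sketch of the idea (the bit layout here is invented for illustration; the real cable VDO encoding is defined in the PD spec):

    # Illustrative decode of an e-marked cable's capability word.
    CURRENT_CAP = {0b01: "3 A", 0b10: "5 A"}
    SPEED = {0b000: "USB 2.0 only", 0b001: "USB 3.2 Gen 1", 0b010: "USB 3.2 Gen 2"}

    def decode_cable_vdo(vdo: int) -> dict:
        return {
            "max_current": CURRENT_CAP.get((vdo >> 5) & 0b11, "unknown"),
            "max_speed": SPEED.get(vdo & 0b111, "unknown"),
        }

    # A typical charging cable: rated 5 A for power, USB 2.0 only for data.
    print(decode_cable_vdo(0b1000000))
    # -> {'max_current': '5 A', 'max_speed': 'USB 2.0 only'}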


How much information about the cable is exposed to a computer's interface, even if nothing is plugged in on the other side? Could this just be a computer program, albeit one that may only work with certain controllers?


I haven't had dealings with these new USB cables yet. Can anyone enlighten me what is happening here?

In older USB cables you used to have four, five, or nine pins that were directly connected via copper wire to the pins on the other end. With the exception of charging-only cables that didn't connect the data pins.

Is the issue just that the new cables tend to only connect a subset of the 24 USB-C pins?


USB-C cables can have all sorts of logic embedded in the cable, including inline resistors to signal capabilities.

However, when it comes to power delivery, for both old-style USB and newer USB-C, the thickness of the copper matters for how many amps the wire can carry.

E.g. Apple's 30W charger comes with a cable of a different thickness than their 61W charger. And what happens when you use the 30W cable with a beefier charger?

https://support.apple.com/en-us/HT201700#usbc

> For the best charging experience, you should use the USB-C charge cable that comes with your Mac notebook. If you use a higher wattage USB-C cable, your Mac will still charge normally. USB-C cables rated for 29W or 30W will work with any USB-C power adapter, but won't provide enough power when connected to a power adapter that is more than 61W, such as the 96W USB-C Power Adapter.

The best part is that the cables look nearly identical with some very small print on the cable that says they are different.
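
The arithmetic behind Apple's note, assuming (as is typical for USB-C) that the thinner cable is e-marked for 3 A and the thicker one for 5 A:

    # USB-PD tops out at 20 V before the new EPR spec, so the cable's
    # current rating is what sets the wattage ceiling.
    PD_VOLTS = 20
    print(PD_VOLTS * 3)   # 60 W: a 3 A cable covers the 30 W and ~61 W adapters
    print(PD_VOLTS * 5)   # 100 W: the 96 W brick needs a 5 A e-marked cable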


> The best part is that the cables look nearly identical with some very small print on the cable that says they are different.

If you're lucky, that is.

One would think that after nearly 2 decades of USB cable confusion the standards bodies and vendors would make an effort to make the cable identification easy, but no.

I suspect it's because they actually want consumers to end up buying lots of cables; they profit from the churn it causes.


As a solution, an anecdote:

My (by now) ancient laptop simply won't charge (or boot from cold) if the wattage of the PSU is insufficient. That makes the problem easy to spot; you find out at the latest when the battery runs out of juice. I noticed this because it originally came with a 45-watt charger, but after a processor upgrade the required power was at least 60 watts.

Note that when off it will charge the battery with any charger, just more slowly.



That's a great idea, really!

I guess it is not completely trivial though, and would probably end up costing a fair bit due to the max bandwidth etc.

Just measuring the resistance of all the conductors accurately enough on a short cable sounds hard.


In an ideal world it would be part of the USB host spec that the host can test the cable and report to the user. That would not be expensive at all.

Resistance matters most for the power connection, and determining whether the voltage drop is acceptable does not need high accuracy.


Wow, the tool that I didn't realize I wanted. Let me know if you find one. I don't have an inexpensive one.


At least USB-A receptacles were coloured... did we really need to descend into hell for a slightly slimmer reversible plug that wears out even faster than USB-A cables?


Not to mention they're significantly thicker and wider than the micro-b socket they replaced, so many tiny products will continue shipping with micro-b indefinitely.

Also I've destroyed probably half a dozen of them by accidentally stepping on or rolling my chair over the connector and smushing it flat.


The little ones broke too often.


there's also the very real danger of identical ports bricking devices. See the Nintendo Switch issue, where some users bricked their devices due to Nintendo's implementation of USB-C. This limits the chargers you can use, and Nintendo may well deny a warranty claim if a third-party charger was used.

Not the best source - https://old.reddit.com/r/NintendoSwitch/comments/87vmud/the_...


Not true; the bricking was from blatantly broken chargers putting 9 volts onto low-voltage signal pins.


that's one explanation, but there is another that is relevant to this thread: Nintendo diverging from the USB-C standard in their implementation. At the end of the day, standards exist for a reason, and Nintendo diverged from the standard.

https://arstechnica.com/gaming/2019/08/heres-why-nintendo-sw...

> If the port fails open—meaning pins just don't make electrical contact—there's usually no real harm done. But if they fail short—meaning pins are bridged electrically to pins they have no business connecting to—you may easily overvolt a pin. Remember that 6V absolute maximum rating on the Configuration Channel of the Switch's USB-C PD chip? Well, it's only 0.5mm away from the VBus (main power line), which carries 15V.


Is there a possible outcome where cable components become cheap enough that every USB-C cable is a universal cable that does everything?


Not if you want cables that are longer than two feet. There's a tradeoff between speed and distance.

And that's after you solve the problem of people trying to cut out a couple pennies of copper. And the people that want thinner cables just for charging.


No, since beyond short lengths Thunderbolt needs active cables with amplifiers, which drives the cost of TB3 cables well over $50 for distances that a standard $15 USB 3.1 cable easily beats.

Physics is a problem here.


As long as there’s a penny to be made by shaving component costs, probably not.

I have USB micro cables that “work” to charge phones or for data but will give voltage-throttling errors if used to power an RPi3 or later. This is presumably from internal wires of too small a gauge. I can't see things like that stopping.


It's not pennies: 0.5m of TB3 cable is $50, and 2m of TB3 cable is $130, while 2m of USB 3.1 cable is $15. Massive difference, don't you think?


By pennies, I meant if someone can sell a substandard (but "mostly works") cable of any variety and have the parts cost $0.10/meter less, the market will ensure that happens.

Was comparing TB2.995 vs TB3, not TB vs USB.


No, because they will keep adding features at the high end as the previous high-end features get cheaper and new high-end features become possible.


No, because vendors will keep putting custom things into their cables to support some magic XYZ feature.


> won’t need to be equipped with an ugly barrel jack

Please bring the barrel jacks back. They were virtually unbreakable. I break about one USB-C cable a week because the connectors are super weak, supported only by two dots of solder on a PCB instead of a solid cutout in the housing.

USB-C should have at least standardized the shape of the rubberized housing around the connector, made a matching inset shape on the equipment, and have that take the brunt of all mechanical stress, just like an IEC power cable, which you can drop bricks on and they won't break.


I know, design and all that, but I would prefer red, blue, and green plugs with corresponding colors on my laptop, to make it easier to tell what I can plug in where. With my desktop, which has around 10 USB ports, I always struggle to find the right one.


Will people ever stop complaining?

If the standard introduced physically different cables or sockets to add new features, people would whine that nothing is compatible anymore; if it stopped adding new features to stay compatible, you'd call the standard dead and not forward-thinking.

Instead it's gradually upgrading the standard with the same physical socket, so that at least some compatibilities are kept and it's still moving forward.

Can it be better than that?

A bit of confusion on which version you need is better than not being able to plug it in at all.


I'd much rather have this and all the fallback than "well it could work but they put a little nub on the port so it doesn't fit."


Meh, I know it's a bummer and have had problems with this too. However, HDMI, DP, and USB-A all have this problem to some extent. (E.g. my graphics card does not work at all, as in it won't even let the computer boot, if an older HDMI cable is plugged in.)


Ooh! Will some of them unexpectedly fry the host, the client, or both, unless you spend hours poring over spreadsheets curated by some random hero on the internet too?


Don't forget SuperSpeed, SuperSpeed+ and SuperSpeed+ 2x2.


if only we could go back in time and warn ourselves about usb type-c. the future seemed so bright


We're frankly headed towards a USB Type-D standard, where we intentionally create different ports for different uses. A dedicated power connector should never be a data connector, and there should be a way to differentiate A/V ports from peripheral ports. The mess we are in now doesn't really work as intended.


USB-C bashing threads represent the 'HN echo chamber' for me. :)

I understand the problems of different cables; I've run into them already. One cable can do Thunderbolt, the other cannot. One cable transmits a 4K video signal, the others don't transmit anything. And so on.

However, it is sooooo much better than anything before because at least _most of the time_ stuff works. Chargers charge. Sometimes slower, sometimes faster, but things mostly work.


> USB-C bashing threads represent the 'HN echo chamber' for me. :)

I don't understand how you can possibly call it an "echo chamber" when there are (1) reasoned arguments (2) supported by facts with (3) dissenting opinions. That's almost the opposite of an echo chamber.


Most users don't notice or care but if you just read HN you'd think USB was controversial.


It’s because virtually no one outside of Hacker News seems to have these problems. The rest of the world seems to love the singular port.


HN would rather have something that works 100% or clearly fails, so that we can understand what's going on and fix it if it's not right. The rest of the world would prefer something that works in degraded mode (e.g. I plugged in the wrong cable and my monitor started displaying at 30Hz - I was horrified, but how many "normal" users would even have the vocabulary to describe what the problem is?).


Wait until you have to do tech support for someone who has a USB-C charging cable that doesn't work connecting their hard drive. Singular cable wasn't in the specs.


Any compliant USB-C cable must support USB 2.0 no matter what. If a USB-C cable isn't working with 2.0 data, then it's a noncompliant cable.


Good luck explaining to your uncle that his USB-C is non-compliant. He just knows that with USB-C you can't even trust the cables anymore.


I’ll take that hypothetical problem over what we had in the past. A dozen different plugs, a dozen different cables, multiple required docks, incompatible charging bricks, proprietary plugs, and more.

I have yet to come across an issue with USB-C either personally or externally. But maybe some day it will happen. And I still won’t care because it’s better than it used to be.


hypothetical problem

No, just another day in IT. I'm hoping they get their crap together eventually.


At this point, I would assume that any user that buys a hard drive (vs using “the cloud”) is advanced enough to figure it out.


>USB-C bashing threads represent the 'HN echo chamber' for me. :)

It is still a rather new phenomenon. Most of HN used to be USB-C supporters, especially those in the Apple camp. For years, mentioning any of the problems listed in this thread would get you downvoted into oblivion.

Since most of them didn't bother jumping in to defence their beloved USB-C, I guess they changed their mind.


> didn't bother jumping in to defence [sic]

Why bother? USB will remain dominant and there's nothing to be gained from the millionth iteration of the argument.


> USB-C bashing threads represent the 'HN echo chamber' for me. :)

If only they had proper labeling and specs, all of this could be avoided. If you make all cables look the same, why on Earth would end users believe they are different?


And it doesn't help that they don't enforce their logo IP. All sorts of random 2-wire under-spec power-only cables and forbidden male-to-female extension cables have the USB logo on them. Look at this obviously wrong statement from the article:

“All EPR cables shall be visibly identified with EPR cable identification items,”

No they won't! Just like all the other stupidly confusing and incorrectly implemented "rules" for labelling and orientation and everything else. Even if they do it consistently, nobody will know which of the many confusing logos means what, and for some reason OSes don't show the user which component is incompatible in which way, so you'll be happily enjoying 100W from your 240W charger, unaware it's not actually delivering 240W.


I genuinely don’t understand how I’ve managed to be so lucky. That is, I absolutely do not disbelieve the many tales of woe I hear on HN (like the many in this thread) - but I’ve literally never put any thought into what cable I plug in to what device, and had no trouble that I can recall. It really had lived up to its hype for me so far.

My USB-C devices are: a wireless charger, two MacBooks (one intel, one m1), a Pixelbook, Nintendo Switch, Oculus Quest and Quest 2, iPad Pro, and the charging case of some earphones… I think that’s everything. Oh, and my partner’s phone and headphones too.

Anyway - I’ve cables and chargers dotted around the house, plus some A-C ones for use with power bricks - and never had any grief powering/charging any device from any of them. What am I doing right? I’m definitely not only plugging Apple devices into Apple cables and so on.


"What am I doing right?"

Maybe not buying the cheapest cables available on the bottom of the Amazon barrel.

But I can't know that. Maybe you are and your place will burn down next week. Who knows?


> Maybe not buying the cheapest cables available on the bottom of the Amazon barrel.

That is definitely true. Given how many of my devices came with their own cables - presumably decent quality - I haven't needed to buy any 3rd party cables.


It's really not as big an issue as the comments make it seem. Of the devices listed, I can only think of Oculus Link on a Quest 2 as using USB 3 speeds in a noticeable way.


I don't think the issues normally show up when just charging; I'd say charging is the simplest task. But when I tried to plug a USB-C monitor into my MacBook the other day, I went through every C cable in the house to no avail, despite the fact they all do USB-C PD just fine.


For Alternate Mode to work, only cables with high-speed lanes will work (aka USB 3.0 cables). Power Delivery doesn't need these high-speed lanes at all.


I figured it was something like that :) but that does nicely illustrate the problem with C: despite the fact that the ports are all the same, there's no reliable way to know if a cable will work for the purpose I'm going to use it for.

Sure, basically any cable can _charge_ a device at some speed, which is by far my most common application, but for shoveling bits around it's much more complicated.


Same deal. I've switched everything I can to USB-C, and so far had literally zero issues.

Obviously my phone charger will charge a laptop more slowly, but I had that issue with micro-USB and Kindle chargers that couldn't charge a phone faster than it used the power, so hardly anything new or unexpected.


I noticed that a couple of days ago when I was able to charge my Surface Laptop with my iPad's USB-C charger. While Surface Laptop already has a proprietary charging port, it also supports charging through USB-C. Seamlessness of the whole experience was very impressive.


Using USB-C for charging is not really the highest bar to clear. High res video would be, for example.


I would bet anyone with a use case that isn’t charging is enough of a power user to understand that some cables are different than others.

For everyone else, their life is all wireless and cloud based, and they just want to charge their devices.


I discovered my camera doesn’t charge unless it’s USB type A to USB type C. I think that’s the only weird thing I’ve encountered in my USB experiences.

It’s the paralenz dive camera. There might even be good reasons for it, but I couldn’t guess.


I have a camera on a Ronin SC gimbal. The gimbal needs to connect to the camera via two USB-C ports using a specific cable (at one end).

Still generally stuck with an array of cables (hard drives, charging cameras, headlamps, legacy iPads, etc).


Plus, if you thread the cable through your hot dog, it will be warm in time for lunch.

Meanwhile, Apple's M1 chip is showing us that 240 watt laptops are the problem, not the solution...


Having one USB-C dock powering your laptop (maybe with a hungry GPU), plus a couple of monitors, with everything going through just USB-C cables, is my hope, and this is one step closer.


At which point do we change assumptions about safety of USB cables though?

Myself and most people I know have always considered a USB connection to be safe-ish. That is, you can keep the cable connected on the supply side and have the receiving end just lie on the desk or the floor, and the worst that could possibly happen is some tiny sparking if the stray end touches something conductive in a very unlucky way. But the more power I see pushed through these cables, the more I start to look at them as live wires hooked to mains power.

Additionally, such wattage sounds like a serious fire hazard if the cable is damaged, which means the cables themselves need to be handled with care. Something that wasn't the case with typical USB charging until recently.


I mean, I share that worry, but isn’t the “-PD” part of “USB-PD” a negotiation step?

I have very little worry that my carpet can accidentally negotiate 60w and up.

I _am_ worried about cheap USB-PD devices that forgo this negotiation, as it is complex and expensive to implement.
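
For reference, the negotiation is roughly this shape (illustrative Python, not the actual PD wire format): the source advertises its options, the sink requests one, and anything that can't be verified falls back to safe defaults.

    SOURCE_PDOS = [(5.0, 3.0), (9.0, 3.0), (20.0, 5.0)]  # (volts, max amps) offered

    def negotiate(want_v, want_a, cable_max_a=3.0):
        for volts, amps in SOURCE_PDOS:
            if volts == want_v and want_a <= min(amps, cable_max_a):
                return volts, want_a          # explicit contract established
        return 5.0, 0.5                       # safe default: plain USB power

    print(negotiate(20.0, 5.0))                   # (5.0, 0.5): no 5 A e-marker
    print(negotiate(20.0, 5.0, cable_max_a=5.0))  # (20.0, 5.0): 100 W granted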


> I _am_ worried about cheap USB-PD devices that forgo this negotiation, as it is complex and expensive to implement.

Yes. Moving critical safety limits into software? https://en.wikipedia.org/wiki/Therac-25 Let's hope they're all up to the challenge ...


How would power negotiation be implemented without software?


There's the cheap devices but even more worrisome are cheap, possibly not up-to-spec cables.

If you buy a fake cable that pretends to have display-port alternate mode or something but doesn't, meh, you're out 10 bucks. But if it pretends to be able to carry 200 W but isn't and burns your house down instead, you might be out a tad bit more...


Should be detectable though, at least in many circumstances.

USB-PD has bidirectional communication, so at least in theory both ends could know how the cable is performing by comparing voltage and current measurements at either end.

If the cable drops too many Watts, the load can be disconnected.
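
A sketch of the idea (the threshold and names are mine, not from the spec): if both ends shared their VBUS readings, the sink could estimate how much power is turning into heat in the cable and bail out.

    def cable_ok(v_source, v_sink, amps, max_cable_watts=2.0):
        """Estimate power dissipated in the cable from the end-to-end drop."""
        drop = v_source - v_sink     # volts lost along the cable
        heat = drop * amps           # watts dissipated in the cable itself
        return heat <= max_cable_watts

    print(cable_ok(20.0, 19.8, 5.0))   # True:  1 W of cable heating, fine
    print(cable_ok(20.0, 19.0, 5.0))   # False: 5 W cooking the cable, cut power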


Is this actually a standard, or would that have to be implemented as some sort of "extension"? And if so, wouldn't that break down the "universal" nature of the beast? As in, a Dell laptop won't be able to communicate with an HP power brick (unless said extension becomes at least a de-facto standard)?

As I've commented elsewhere, I wonder if, at that point, there still is that much of a point in sticking with USB-C for what seem like fairly specific applications.

I mean, if we're at the point of carting around such a monster of a laptop, will an additional, dedicated, power-only connector make that much of a difference? Don't get me wrong, I'm actually somewhat tempted by such a thing, but just because it would be easy to move around, not for actually carrying on my back all day every day.


It's not part of the current standard[1] as far as I can see.

There are some mitigations though, like the cables must identify what they can handle (else the high-current modes can't be used), and both ends need to verify the plug and cable etc.

In theory though I don't see why they couldn't include this in the standard.

[1]: https://usb.org/document-library/usb-power-delivery


Now you will run into WiFi type problems where your cable gives you different charge speeds depending on the season and time of day. Service will degrade and you may not notice the problem and may never figure it out in the end.


Beats burning down the house though... Also in case of laptops, mobiles etc, the device can report the error.


Your carpet cannot negotiate a high power, but two devices might negotiate a high power while not being aware that the cable has a cut in it and is in direct contact with the carpet.


Yeah, that's the problem. I've had a bent USB-C cable (from a pretty good brand) literally melt in my hands. I'm so glad I was using the phone at the time; I don't know what could've happened if I wasn't there. The dent seemed pretty "small" too, and it was way less damaged than a lot of Lightning/micro-USB cables I've seen and used in the past.

What I don't understand is how the cable doesn't have a way to detect a short circuit. I'd hope that a 250-watt-capable cable would have more safety features, though.


I don't think the non-USB-c cables have any more safety features to prevent that. I'd be surprised if the USB-C ones do.


The original USB standard was 5V 0.5A, which wasn't going to start any fires.


Oh, I realize that I left out important context. I didn't mean relative to other USB cables, but rather relative to other non-standard laptop power cables. The goal is to replace those high-power cables, and those did not have significant safeties against damaged wires.


>At which point do we change assumptions about safety of USB cables though?

It's a good question. Generally speaking, DC is considered a shock hazard at or above 60V, but OSHA recognizes 50-60V as being potentially hazardous. It's certainly an arc hazard when disconnecting as the article notes. And 5A at just about any voltage will start a fire in case of a short circuit.


I'm not 100% sure, but I believe a USB-C cable connected to a USB-C charger is cold by default until something is connected. Don't try this at home either, but it should also be perfectly safe to plug a USB-C cable into two different wall chargers.


This is my reality today; my personal setup at home is completely centred around a CalDigit TS3+, to which I have connected ethernet to my local network switch, a keyboard, mouse, headphone amplifier, plus two monitors.

Feeding the CalDigit dock I have a Thunderbolt cable going to my laptop (a 2019 MacBook Pro), and I have a second Thunderbolt cable run from my desktop (an HP Z4 with a Thunderbolt card) which I can swap in at a moment's notice if I need more horsepower or want to play games etc.

Thunderbolt for both computers enables a single-cable setup. It really is super convenient.


It's already possible. I have a 2019 Intel MacBook with AMD graphics. It connects to a CalDigit T3 Plus dock via one USB C thunderbolt cable. The dock connects to two 1440p 144hz monitors, a keyboard, speakers, webcam, USB microphone, and Ethernet. The dock powers the laptop over the thunderbolt cable.

I also have an Intel desktop computer with a thunderbolt port. I'm able to switch from my desktop to my laptop with just one cable.

It still has a few rough edges, but overall it works better than anything else I've tried.


Meh, I don't know. This seems like a pretty niche use case. You'll need a specific USB-C dock to power such a laptop anyway, so why not go the route of the Apple Thunderbolt displays, with a dedicated power connector run through the same sleeve as the data one.

In practice, I find the loss of convenience compared to USB-C negligible. You still only have one cable hanging around.

Plus, as those are PCs, said connector wouldn't even need to be something specific, I suppose a random (big enough) DC barrel plug would do and be compatible with different manufacturers' products.


I had to reread your comment a few times, and then I figured out why it was so hard to parse:

For me, my monitor powers my laptop.

More specifically, my two 32" 4K monitors are plugged into the wall, and then both have USBC cables into each side of my 16" MBP (2019). I keep my MBP power cable in my suitcase for when I travel.

I love the setup - only two cables on my desk, and there is a nice symmetry about it.


Already doing this with my 16" MacBook Pro using a 24" LG 4K display. Got a second (much cheaper) 24" 4K display, a couple of dongles, and an externally powered USB 3 hub plugged into the back of it.

Paying $700 for a monitor was a bit painful but I have no regrets.


For the record, the M1's GPU performance doesn't even start to compete with the 2021 laptop market, much less the 2014 one. GPUs have always been the biggest power draw in these laptops, and it's honestly no surprise that Apple can cut their power consumption down to 10w when their GPU is as pathetic as it is.


I don't think this is accurate. [0] and [1] have the 3DMark Ice Storm Unlimited Graphics benchmark tested on both the M1 and the 4800U in the Lenovo Yoga Slim 7-14ARE, and the M1's GPU stomps the Ryzen 4000's Vega 8. It outscores even the Ryzen 5000 series iGPU.

I've seen no evidence that the M1's GPU is anything but best in class for integrated graphics.

[0] https://www.notebookcheck.net/Apple-M1-GPU-GPU-Benchmarks-an... [1] https://www.notebookcheck.net/AMD-Radeon-RX-Vega-8-Ryzen-400...


From "doesn't even start to compete" and "GPUs have always been the biggest power draw" I don't think they were comparing to integrated.


But then again, Apple's GPU was limited to 10W max, and that was a design decision, not a technology limitation. There is nothing stopping Apple from putting in a 16- or 32-core GPU, which would bring its GPU performance on par with, if not beyond, market competitors.


> But then again, Apple's GPU was limited to 10W max, and that was a design decision, not a technology limitation.

No, it was a technology limitation. The reason I know is because of the M1 Macbook Pro, a device explicitly designed to reap the maximum performance benefits of the M1. It was designed with active cooling and still didn't really manage to score much better than the M1 Macbook Air.

Either way, Apple can't just magically increase the wattage of their chip and make it run faster. They had every opportunity to do that in the M1 Mac Mini and the new iMac, but they didn't. It's a very obvious limitation of the SOC's capabilities, and I honestly can't find any evidence to suggest otherwise.


>a device explicitly designed to reap the maximum performance benefits of the M1.

>Apple can't just magically increase the wattage of their chip and make it run faster.

Because that is not how it works.

The M1 GPU has a maximum TDP of 11W. That is by design. They could push it with higher clock speeds beyond their optimal level, with higher voltage, but that has other testing and reliability implications and costs. It doesn't matter whether you have a large fan and heatsink sitting on top of it or no cooling at all: it runs at 11W max. That is part of the design. The M1 MacBook Air allows it to run at 11W for a fraction of the time before ceding TDP headroom to the CPU. The MacBook Pro allows it to run at its maximum for longer, since it has 25W of cooling capability.

If you want higher GPU performance, throw in more cores. GPU workloads are inherently parallel; the only limitations are interconnect and memory bandwidth, and both of those are not technical barriers but cost concerns.

In a 10W GPU comparison (and why would you compare against GPUs running at higher wattage?), the Apple GPU does fairly well in all Metal-optimised benchmarks, compared to AMD Radeon GPUs, which have also been optimised for the Mac platform (although on a larger node).


You're right, but I don't think that the other commenter was speaking to iGPUs. Higher wattage laptops mostly have beefy GPUs sucking up that power, and you'd be hard pressed to find any iGPU that compares (including that of the M1)


Sure, but multiply that by several times (such as the leaks saying that the M1X chip will have 16 or 32 graphics cores compared to the M1's 8) and it'll still be well within the territory of the current 100W spec while handily outperforming pretty much everything else with the same power draw.


Which kind of laptops are you speaking of? M1 GPU performance blows away any integrated GPU and all middle-class discrete GPUs (MX250, etc.).


The MX250 is not a "middle-class dGPU", because there is no class below it. It's the weakest dedicated graphics card Nvidia continues to sell, and it's a pretty terrible point of comparison. A better example would be the GTX 1060, a middle-class dedicated GPU that made its way into many budget gaming laptops five years ago. It is faster than the M1's GPU.

The Mac has a very dedicated audience of video and design professionals who are going to be left empty-handed here, even if they double or triple the amount of GPU cores in the SOC.


Gaming laptops I assume.

M1 GPU beats most other integrated GPUs, but that doesn't magically make the demand for more performance go away, and that demand is likely to increase as there become more and more non-gaming applications for GPUs (machine learning, video editing, etc).


[flagged]


Do you have an actual disagreement? A 14nm Intel chip might waste an extra 20 watts, but when you get into hundred watt territory it's fair to call that a GPU thing.


The Intel astroturf isn’t even artful, it seems either bot or Amazon Turk driven.


On the other hand, I would love to use an M1 chip that makes use of 240 watts. Could you imagine the power? Give me the option to trade efficiency for horsepower. When I hear things like "My M1 mac never turns on the fans" I wonder why the system isn't clocking higher if the cooling system is running so comfortably.


For most users the M1 is fast enough that a nicer user experience (no fan noise) is more worthwhile than slightly better performance.

My friend recently purchased a MacBook Air with M1. There's no fan at all. It's incredible. It's a block of metal that just works.


I think most users would at least appreciate the option. For example, Windows has power settings where you can adjust the clock of the CPU however you like when plugged in or unplugged. Consider also that gaming has never been a silent prospect, on a desktop, a laptop, or especially a game console. The Switch is silent, but gamers know Nintendo is compromising on graphics fidelity compared to competitors to make a silent handheld device, and it does get hot during use. I don't think people would mind if their fans spooled up when they are getting good frames at high graphics settings from their games, or at the very least they'd appreciate the option to select a more performant clock speed if fan noise didn't bother them. Fan noise doesn't bother me; I game with headphones like most enthusiast gamers. Apple gives you nothing like that right now, that I know of.


> I think most users would at least appreciate the option

The M1 Macs are the first and lowest-end Apple Silicon Macs we'll see. There's a reason they only replaced the cheapest devices with them so far - wait until we're done with the 2 year transition period and I doubt you'll have anything to complain about.


I think you're right, but it's not in Apple's ethos. As an example see the Apple Magic Wireless mouse that has its charging port on the bottom specifically because Jobs didn't want users to ever use it as a wired mouse.


For 20 years (ok, more, really), laptops have not been fast enough to do reasonable professional-grade photo editing using standard Adobe tools. The complexity of the algorithms being applied increases at least as fast as the processor power. So, put me down on the side of wishing they'd build a SKU that could be optimized for performance, rather than only power consumption.

(Sometimes a desktop is not practical, like on a remote photo assignment, but there is a power outlet.)


The Pro will be out within a few months.


> why the system isn't clocking higher if the cooling system is running so comfortably.

Because there's an upper limit on how much power (voltage, clock rate, etc.) you can shove through a CPU before it starts malfunctioning or getting damaged by purely electrical effects, no matter how effectively it's cooled?

It's entirely possible Apple has set the nominal limits artificially low for business reasons, but there are actual physical limits here, and depending on how the CPU is designed and optimized, it may well be easy to build a cooling system that significantly exceeds what those limits allow to be demanded of it in a wide range of cases.


Clearly there are non-computing use-cases for 240W devices that would benefit from being charged. What about power tools, or high-end game consoles?


There’s an alternate universe where all wall power outlets are USB-C with ethernet already mixed in.


I think people are missing the point of supporting this much wattage as a USB-IF standard. There are already 130W power supplies from Dell, so it's going to happen with or without standardization, and I'd rather it happen with. It's also not just laptops that might be powered by this. The article mentions that all-in-one computers like the iMac might use this, and I could imagine it replacing those awful AC adapters used on monitors with no internal power supply.


50 volts @ 5 amps? Better make sure your ports don't have any lint inside. And don't forget your special EPR-certified 240W USB-C cable, of course. The U in USB stands for "universal", but it feels like USB is trying to target a very niche market with this.


5A, 50V

Good luck.

The USB port just isn't physically ready for this. This is going to end with some spectacular fireworks once you factor in cheap Chinese engineering for cables, ports, and chargers.


Of the 415 pages in the spec, 8 of them are devoted to arc mitigation in the "USB PD High-Voltage Design Considerations" section.

Lots of discussion about detecting unplug and limiting slew rates. In practice that will cost pennies to implement and will therefore be skipped in many designs. ("I got a great deal on this charger on Amazon!")

Time to develop some user superstitions around USB C:

"The withdrawal velocity is a factor in whether an arc will occur or not. If it is fast enough, then there is insufficient time to reach the voltage differential needed to form an arc. In practice, the withdrawal rate may not always be fast enough to keep the differential voltage below the threshold of arcing."

So, tell your informal tech support clients (family and friends) that they just didn't unplug their cables fast enough when their cables and devices start breaking. I look forward to everyone's sleight of hand moves where they unplug cables so fast you can't see it.


I already have enough talks with my family explaining which charger and cable goes with which phone.

Even more humiliating is them looking at me as if I were a complete idiot after I suggest that the choice of cable itself may be the cause of their problems. They had some physics at school, and from their point of view the cable is just a bunch of copper and it should make absolutely no difference to how fast their phone charges.

Now I have to ready myself for those "universal" chargers which can only ever charge a single device, with no indication as to which device they can charge or which cable you have to use for this to work.

Which is pretty ironic, because right now we have a bunch of laptops and it is much easier to explain that if the plug fits the socket you are OK, and if it does not you need to look for the one with the matching plug.

The only solution I see is labeltown -- dust off my label printer and have every single piece of equipment, charger or cable clearly labeled.


> They had some physics at school, and from their point of view the cable is just a bunch of copper and it should make absolutely no difference to how fast their phone charges.

Even if it is just a bunch of copper, there is thin, long copper and thick, short copper. I had a cheap, long micro-USB cable that was too thin to deliver 2A at 5V, so my phone couldn't charge correctly. Using Ohm's law, I could calculate that the voltage drop was enough to push it out of spec.

The cables you need to transport power at 5V can be surprisingly thick, and that's a reason why fast chargers are usually higher voltage. I ran the calculation to power a 5V LED strip over 10 or so meters, and came to the conclusion that I would have needed utility-sized cables.
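
The back-of-the-envelope version, for the curious (copper resistivity ~1.7e-8 ohm-metres; the run length counts twice for the return path):

    RHO_CU = 1.7e-8   # ohm*metre, resistivity of copper

    def drop(volts, amps, metres, mm2):
        r = RHO_CU * (2 * metres) / (mm2 * 1e-6)   # out and back
        lost = amps * r
        return lost, 100 * lost / volts            # volts lost, % of supply

    # 10 m of 0.75 mm^2 wire feeding a 2 A LED strip at 5 V:
    print(drop(5, 2, 10, 0.75))   # ~0.91 V lost, ~18% of the supply

To stay under a 5% drop you'd need roughly 2.7 mm² conductors, and a real strip draws far more than 2 A; hence the utility-sized cables.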


Utility-sized cables, or alternatively a separate sense wire to measure the voltage on the other end of the cable. This would allow you to adjust the output voltage of the power supply to deliver the required voltage, including the voltage drop over the cable, and your LEDs would work just fine.

At least as long as the cable doesn't catch fire from the resulting power dissipation...


You know, as I think about this, maybe it's ok to have the standard overreach the physically feasible. There's a spec if you want to use a fat cable...most products will never reach that spec. Is that so bad?


BTW, does anybody know if that's safe? The numbers I remember are an order lower (24V, and less than 500mA).


They want 240W. 5A times 50V is 250W. It means that you can't have both voltage and current lower, you can only lower one by increasing the other. But both higher voltage and higher current are a problem.

More voltage will require larger clearances and better safeties (like thicker insulation on cables).

"In industry, 30 volts is generally considered to be a conservative threshold value for dangerous voltage. The cautious person should regard any voltage above 30 volts as threatening, not relying on normal body resistance for protection against shock." https://iastate.pressbooks.pub/electriccircuits/chapter/chap....

Now, if the voltage is dangerous, it stops just being problem for the device but becomes safety hazard with all implications.

On the other hand, increasing current is not without problems. It quickly requires thicker cables and traces, for which space is just not available in the tiny USB-C connector. If a conductor is not thick enough, the result is heating and possible melting, degrading the material over time, which could lead to shorts.
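
To put numbers on the current side (a rough sketch assuming ~0.1 ohm of round-trip cable resistance; the heat generated in the cable is I²R regardless of the voltage being carried):

    R_CABLE = 0.1   # ohms round trip, illustrative

    for volts, amps in [(20, 5), (48, 5), (5, 48)]:
        delivered = volts * amps
        wasted = amps ** 2 * R_CABLE   # heat dissipated in the cable itself
        print(delivered, "W delivered,", wasted, "W of cable heating")

    # 100 W delivered, 2.5 W of cable heating   (20 V / 5 A)
    # 240 W delivered, 2.5 W of cable heating   (48 V / 5 A: same current, same heat)
    # 240 W delivered, 230.4 W of cable heating (5 V / 48 A: a space heater)

That's why the spec raises the voltage and keeps the 5A current cap.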


I bet people downvoting you don't understand physics... :-/


Shh... it is not in good taste to directly unmask the shortcomings of the general HN audience. Everybody here is an expert in every topic discussed, and if you got downvotes it means you deserved them.

I personally own an electronics lab but whaddaiknow.

On a more serious note, USB-C just barely has the dimensions to deal with 5A at 50V. You need thick enough cables and traces, you need clearances, and you need margins.

If you look at the breakdown voltage of the USB-C connector, it is not a lot above 50V; usually you would want many times your working voltage. It just begs for a tiny speck of dust or condensation, a manufacturing fault, a bent connector, etc. to cause a bad day for the owner.

Now, a properly designed device technically should immediately detect the situation and cut the power. But there are two catches. One is that at that power, the short might have just about the right resistance for this safety to treat it as a valid load. The second is that this assumes a properly designed device: if you cheap out on the silicon you use for your charger, it might just not have the capability to stop things before they go too far and melt a bunch of stuff.


Reminds me of those lovely "molex" to SATA power connectors. In my marginal layperson's mind they should be superior for transferring current, and still some of them don't fare exactly well even with lower current draws...


Does USB-IF include wire gauge / conductivity requirements in the standard?


I still don't understand the point of all this... What is wrong with a simple barrel connector... It's not like a phone is gonna need to draw 240 watts...

Just keeps making the spec more and more complicated...

Also, are the small contacts in a USB-C connector even reasonable to run with this much current? Is there enough cross-sectional area on these contacts?


How exactly am I going to charge my phone, headphones, gamepad, tablet and bunch of other devices from a barrel connector?

USB-C has pretty much eliminated forlorn MacBook users shambling around our office and looking for another soul to give them a compatible proprietary charger. Let's keep it this way.


None of those devices you listed need 240 watts... That's quite a bit of power to just trust some protocol to negotiate correctly.


And yet they will charge from the 240W charger all the same. Which is what makes the standard great.


I use my MBP charger for my iPad and Android phone and it works well. Am I in the minority for doing this? Maybe, but my guess is not.


What if the rest of the devices charge through the MacBook at once?


Powering 90% of the things in your house via DC power should be the future: no more switching from DC solar/battery back to AC and then back to DC again for no reason.


The reason is that AC can be easily transformed to very high voltages which allow power transmission over long distances with lower losses. I doubt this is going to change any time soon.

Of course, you could still have central DC conversion in your house (or maybe even neighborhood) and use USB or similar in your home’s wall outlets, instead of each device needing its own little power brick.


So, back when the electrical grid was being created, it was basically impossible to step DC up and down, so it had to be distributed at the voltage it was used at. AC could be stepped up and down with transformers, which reduces losses. Now that we have the circuitry to step DC up and down fairly easily, you can actually get more efficient long-distance transmission with DC, since you don't have to factor in things such as the skin effect. I don't think things will change quickly, but I do believe we will be seeing more DC systems in homes and workplaces in the future, since they will also mesh better with renewables.

https://en.wikipedia.org/wiki/High-voltage_direct_current#Co...


DC can be better over a certain distance because AC loses a lot of energy to heat. The Pacific Intertie is DC.


Semi related question: has anyone come across a product that would let me replace an outlet or a light fixture with a flush (clean design) thing that provides enough power for an LED strip?

I don't like having a large DC converter hanging off a switch hanging off an outlet. And I can't have a DC converter hang off a ceiling (nor do I want to wire it directly and hide inside the ceiling either).


Amazon sells Leviton (a good/common brand) electrical outlets that support USB C power delivery for ~40USD.

You can install one, and connect a USB C PD to DC cable ~20USD to power your LEDs.

Your LEDs have to be 30W or less though.


You need the AC->DC converter somewhere. It either needs to be in the light or outside.

There are LED fixtures that include the converter; you decide which one you like the aesthetics of.


I suspect you need POE (power over ethernet), or to at least start from there.


Maybe with high voltage DC, but 5V or even 12V DC distribution is a complete no-go/waste of metal.


At 5V, consider your household needs: 1) refrigerator (400W when its motor is running); 2) vacuum cleaner (1000W when running); 3) TVs/PCs/light bulbs (around 1000W in the evening when all are running at the same time). All of this amounts to at least ~2.5kW of power. Now divide that by 5 and you get a current of 500A. Do you have any idea how thick the copper would need to be to carry 500A? A full 1cm x 1cm. Any idea how heavy that is going to be? Or how expensive? That's why, even over a 10m distance, AC at 220V is better.
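
The arithmetic, roughly (assuming a sane ~5 A/mm² current density for household copper):

    P, V = 2500, 5      # watts of simultaneous load, volts
    I = P / V
    print(I, I / 5)     # 500 A -> ~100 mm^2 of copper (the 1 cm x 1 cm above)

    # Same load at 220 V AC:
    print(2500 / 220, 2500 / 220 / 5)   # ~11.4 A -> ~2.3 mm^2, ordinary wiring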


Who's running their vacuum cleaner all evening?

My fridge/freezer uses 25 watts on average, but the 90% of things I was thinking about excludes the heaters and motors: just the lights, computers, speakers, etc.

Next question: why 5V, when they're talking about 50V? That brings us down to a tenth of the amps.

Also, this is only inside your house, not for power distribution. Generate your DC power with solar, store it in your car/battery, and power your LED lights, smart speakers, computers: all the stuff that converts power to DC before it uses it today.


25 watts might be the average, but what is the peak, and what is the consumption during the duty cycle?


Exactly my point, but @mavhc didn't get it. You vacuum for 10 minutes, but during those 10 minutes, if your refrigerator motor engages, suddenly your standard 3mm copper wires are going to become white hot and melt, and you get a nice warming house fire because you didn't think of the peak current. This statement assumes 5V DC at the house power outlets.


Apple and Microsoft's Surface had (does the Surface still have?) the perfect connector: MagSafe. The more we dig into this power-over-USB-C garbage, the farther away we get from that perfection. The patent expiration cannot come soon enough; I seriously hope all portable device vendors bring it back.


The leaks say MagSafe is coming back on the MBP 14" and 16" laptops coming, possibly, at WWDC next month.


I've sworn off the past 5ish years of Apple laptops, but if they bring MagSafe back, ideally along with a non-Touchbar option, that just might be enough to bring me back!


The rumors are indeed that Touch Bar won't be on the new 14 and 16in either. I won't miss it, but I certainly don't mind it.


OTOH, you'll still be stuck with Big Sur (or worse).


I'm going to be very annoyed if Apple ditches the Touch Bar and gets that close to making the perfect laptop, then takes away my USB-C charging.

Knowing Apple they would never offer both.


Magsafe were the perfect connectors, but it was a huge shame that Apple botched the cables attached to those connectors. I went through three of them in three years; they yellow within months then start to fall apart. But worse than the cables were the fanboys who were always eager to blame me for the cables failing. I've never had such problems with cables before or since. Despite liking the connector itself, I am glad to see these gone.


Surface still does have this connector, but can also charge from USB-C at reduced speed!


That's the perfect way to do it. If it's docked, you likely don't care as much about the charging speed. And if you're using on your lap or coffee table, you probably care more about the MagSafe aspect.


The Surface dock uses the magnetic connector.


There are already plenty of magnetic USB cables; most will only carry power or maybe USB 2.0, though.


Excited for the first ATX power supply which takes USB-C as input rather than the traditional IEC 60320 C13/C14.

I mean, it'll probably cause some fires, but it'll be exciting!


Wouldn't an ATX power supply outputting USB-C make more sense? They have superior efficiency compared to your average power supply...


I guess an ATX power supply accepting 240W USB-C, which is 12V, would only do a tiny bit of DC-to-DC conversion for 5V and 3.3V, passing through the rest as 12V.

And an ATX12VO power supply taking in USB-C would do… nothing?

Heh.


240W USB-C will be 48 V (the current limit is 5A as it is in the current up-to-100W spec, just the voltage max goes from 20 V to 48 V)


Ah sorry you're right, I don't know why I thought 12V.


240W through those little pins in that little plug? I don't know if I want to power that large of a laptop with such a little plug.


Pin size limits the available current. 48V @ 5A = 240W


Does the USB-IF spec out conductor gauge? For 5A, I wouldn't go anywhere below 20 or 22 gauge, but I highly doubt that's going to be the case.


That's arc welding power, 50 volts at 5 amps. I fail to see how a connector is going to last more than a few cycles under load.

Also, don't use that near anything flammable.


Arc welding is typically 50-500 amps.

Presumably this would be a negotiated system like USB-PD, where only a tiny amount of power is available at connection, and with pins physically configured to break contact in a defined order, you may be able to shut down the higher power before the main contacts disengage.


Inevitably, power won't always be shut down first, or strands of the cords will break. It feels like we're going to have to learn the lessons of UL-approved power cords on appliances all over again.


You can TIG weld down to 5-10A range...


... and no one will have the slightest clue what their chargers, cables and devices support.


Simple. The really bulky chargers will be suitable for charging laptops. Everything else is for phones.


If that becomes the standard to judge by, then there will be really bulky chargers from China containing lead weights and inadequate circuitry.


Isn’t that already the standard to judge by? You can already buy both small and large chargers from China.


Approximately all chargers are from China.


65W GaN chargers are getting pretty small, comparable with the crappy 5V USB power supply Apple used to pack in with their phones.


This is pretty good for the people who have such laptops, but I'm just struggling to understand the use case for such a laptop. A laptop that consumes that much power can't possibly have much battery life. At that point, are you getting much portability out of your laptop? Why not just use a desktop?


There are two use cases for laptops. The first is the normal one where you carry your laptop around wherever you go, and actually use it on your lap sometimes. The second is to view your laptop as a mobile desktop, where you have a few stations. The laptop would always be plugged in, but it might be plugged in different places.


> Why not just use a desktop?

Gaming laptops that contain high-end GPUs can consume this much power while charging the battery and running games. And such laptops are still way more portable than a desktop. You can pick the laptop up, put it in your car, drive off, and have a gaming PC ready to use wherever you end up. The same cannot be done with any desktop.


Typically they're throttled by poor thermals. I'd be interested to know if anyone has recommendations for RTX-grade laptops with nice CPUs in a 15-17 inch form factor that don't throttle but are still somewhat portable. I have not found any, but I think the ones I've been testing have been too thin (XPS and Blade).


Have you checked out the Thinkpad P15/P17? I can't speak as to if and how they throttle (probably to some extent, unless in an ideal climate), but generally the Thinkpad P series holds up pretty well (obviously at the cost of slimness). It can be configured with up to a Quadro RTX 5000 Max-Q 16GB.

EDIT: Judging by notebookcheck, you should also check out HP ZBook Fury 15 G7, Thinkpad T15g, Dell Precision 7550, and MSI WS66.

https://www.notebookcheck.net/Lenovo-ThinkPad-P15-Gen-1-lapt...


There are a number of "balanced" gaming laptops that cut back GPU power a bit to accommodate a thinner, lighter build without throttling often or at all.

One such machine is the ASUS Zephyrus G14/G15 and in the next month or two, M16 which weigh ~4.5lbs and come with Ryzen 5000 mobile CPUs (up to 5900HS) and 95W variant RTX 3000 GPUs (up to RTX 3080). I've been using the G15 for the past month and it's not bad, kind of a midway between a Macbook and traditional gaming laptop. Its looks are low profile enough that I wouldn't be embarrassed to bring it into an office.

If you're willing to push the slider a bit further in the power direction, there are machines like the Lenovo Legion 5 Pro and Legion 7 16", which weigh about 1lb more than the Zephyrus machines (~5.5lbs) but come with significantly beefier cooling, a higher-TDP CPU, and 130-150W variant RTX 3000 GPUs. They're a bit more of a desktop replacement but still fairly reasonable to lug around, nothing like the 8-10lb behemoth Alienwares of old.


Ok great info, gonna check for one with a huge battery. I'd like to learn more about undervolting and such to see how much battery you can save. I want to work on it on battery for long stretches of time, activating the GPU periodically to crunch or render something and then turning it off to save on power.


Have XPS 9570, can confirm that it works gorgeously for a minute or two and then throttles down to nothing. This is both before and after opening it up to clean out the fans and redo the thermal paste.


I've had some luck forcing it to use integrated graphics until explicitly told otherwise, but CPUs tend to run hot as well. I have not yet tried the new Zen processors, but given their power draw I suspect they perform well in these conditions.


It's still a mobile form factor.

That's basically why. It's a computer you can take with you without worrying about packaging or needing to bring cables.


What's "that much power"? More than 100 watts?

I can find a laptop with a recent Ryzen CPU and an RTX 2060 that gets 10 hours of battery life. At the same time, those parts could be pushed to 150 watts if you had enough cooling. And you'd need even more power than that to charge while using it flat-out.


I think these kinds of devices are primarily portable and only occasionally mobile. It is a subtle but important difference. Most gaming notebooks over 13" are too bulky to be considered mobile devices anyway. The only beefier notebooks that can still reasonably be used as mobile devices, even at 16" screen sizes, are, in my experience, MacBook Pros. But YMMV.


I have a desktop and a laptop. Playing a game on my desktop means spending time well away from my family. Using my gaming laptop on my dining room table means I'm still around. I can answer the door, help out if need be. Makes things much easier in a busy household.


Because most desktops don't have integral screens and keyboards. Nor do they stay on and keep computing, albeit in a degraded state, while in transit.

Not everyone is just using laptops as a word processor/browser. Sometimes you just need oomph.


I would love it if my MacBook stayed on and computed when I closed the lid. The behavior is pretty inconsistent no matter what settings I check in the control panel. Sometimes I close the lid and it holds its SSH connections; sometimes it drops them, and I don't know why.


Can't help on the lid closure, but apparently there's some screensaver setting that tells the system to keep working in the background even when the lockscreen appears after screensaver activation.


I like to be able to move a powerful computer to different rooms of my house.


I use my laptop as a portable desktop and plug it in where I'm going. I have zero use case for a laptop that isn't plugged in.


I'm having a hard time understanding the negative comments here. Can someone explain precisely what the commenters here know that the USB-IF doesn't?

Because according to the comments here this won't work, will cause frequent fires and is an all around insane and unworkable idea.


> what the commenters here know that the USB-IF doesn't

The fact that in the real world, in 2021, it's still impossible to tell what standard a particular cable or device supports. I have several cables in my possession, and only through trial and error can I tell you which cable supports what. I'm lucky that these cables are high-quality and fail "safely", but technically they don't have to.

Yes, I know that technically the computer and USB-C controller know which cable supports what, and yet so far no consumer-grade device has any kind of UI to tell me. I can probably figure it out using the command line, but that solves the problem for me, not for the average non-technical user who just wants a cable that works.
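
For the curious, on Linux you can at least see what each connected device actually negotiated. A minimal sketch (Linux only, assuming sysfs is mounted at /sys; note it reports the negotiated device link speed, not the cable's rated capability):

  from pathlib import Path

  # Print the negotiated link speed of every connected USB device.
  # The "speed" attribute is in Mb/s (480 = USB 2.0, 5000/10000 = USB 3.x).
  for dev in sorted(Path("/sys/bus/usb/devices").iterdir()):
      speed = dev / "speed"
      product = dev / "product"  # human-readable name, if the device reports one
      if speed.is_file():
          name = product.read_text().strip() if product.is_file() else dev.name
          print(f"{name}: {speed.read_text().strip()} Mb/s")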

In the old days, you could buy a USB-A to micro USB cable and have it work and charge your phone. You could buy an HDMI cable and have it work and send video to a monitor. With USB-C, you can't know which cable supports what until you've spent hours researching and understanding how USB-C works and the different alternate modes, and even then, cables might be mislabeled and you still can't be sure until you actually try it.

All the above is at least somewhat "safe" because the worst that can happen is that the device gets damaged, but if you suddenly start increasing power levels, non-compliant cables will start burning down houses.


You're overthinking it. The average consumer will just read the big text on the package, ask a shop assistant or a friend.

You'll have different sections in shops, "fast charging cable", "fast transfer cable", or "fast charging AND fast transfer cable" (with a price to match)

Or stuff like "supports connecting a TV", etc.

People will understand that there are different cables for different jobs, because they understand that a universal cable will be way more expensive.

> In the old days, you could buy a USB-A to micro USB cable and have it work and charge your phone.

Some USB-A to micro cables charge way faster than others (thicker wire). So this supposedly new USB-C problem is actually not new at all.


> Can some explain precisely what the commenters here know that the USB-IF doesn't

Safety margins.


The next hurdle will be the FAA's 100Wh limit on lithium batteries. Flying with anything larger generally requires airline approval.

That's why Apple's largest 16" MacBook Pro has an exactly 100 watt-hour battery. If some laptop can charge at 150W or 200W, then presumably it would draw enough power during normal use that it'll need more than 100Wh of battery to be usable for long.


I welcome 240W, despite any potential arcing/melting issues, it seems worth the risk. However, it's masterful trolling on the part of USB-IF to name it USB-C 2.1.

Their naming continues to be legendarily bad, why couldn't they just call it USB 4 instead of overloading the existing USB 2.1 version?

Everyone needs to learn from the Wifi and LTE standards groups which finally abandoned 802.11acxqyerwzdfqwrty naming and moved to simple version numbers like "Wifi 6" and "5G". Simple version numbers are so much more effective at getting people to upgrade as well, because they're much easier to market to the average person.

https://techcommunity.microsoft.com/t5/microsoft-usb-blog/us...


USB-C will soon be more cable than port, girth-wise. 3.2 cables are already very stiff, impractical and quite expensive.


240W!? On what, a fraction of a mm of bloody copper?! Whoever designed this hasn't actually encountered electricity in real life.


Lots of misconceptions about how power delivery works over USB-C. Here are a few facts that may help:

- USB-C power is negotiated over the USB-PD protocol. Dumb devices are going to get 5V at a low maximum current.

- The sink device (laptop) must communicate and specifically ask for the voltage/current it needs. Sources (power supplies) aren't going to just push the max voltage to any device.

- All USB-C cables are required to support 3A of current. For any higher current, the cable must have a chip that advertises its current capabilities.

- The new spec imposes a further requirement on the cable (in the form of an extra capability advertised by the chip in the cable). Power supplies will only enable 48V if such a cable is present.
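
To make that concrete, here's a toy model of the negotiation logic, not the real PD wire protocol. The PDO list mirrors common fixed levels, with 48V/5A being the new 240W EPR tier; function and parameter names are made up:

  # Toy model of USB-PD source/sink negotiation (illustrative only).
  SOURCE_PDOS = [(5, 3.0), (9, 3.0), (15, 3.0), (20, 5.0), (48, 5.0)]  # (volts, amps)

  def negotiate(pdos, cable_amps, cable_epr, sink_watts):
      # Current is capped by the cable's rating; voltages above 20V are
      # only offered if the cable's e-marker chip advertises EPR support.
      usable = [(v, min(i, cable_amps)) for v, i in pdos if v <= 20 or cable_epr]
      for v, i in sorted(usable, key=lambda p: p[0] * p[1]):
          if v * i >= sink_watts:
              return v, i                 # lowest level that covers the need
      return max(usable, key=lambda p: p[0] * p[1])  # otherwise: best effort

  print(negotiate(SOURCE_PDOS, cable_amps=5.0, cable_epr=True, sink_watts=240))  # (48, 5.0)
  print(negotiate(SOURCE_PDOS, cable_amps=3.0, cable_epr=False, sink_watts=60))  # (20, 3.0)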


You know, ordering quantities of USB/Thunderbolt cables is already a nightmare and now I really have to worry about this crap?

Does anyone have a "meets all the damn specs" USB-C/Thunderbolt 4 cable they recommend?


It could power non-computers too. USB is a de facto standard for ubiquitous DC power, but only for small devices.

Now we might have a standard for a broad range of DC devices.

Those USB coffee warmers might actually heat your coffee. :)


Am I the only one who feels like it’s a bit late for high powered USB-C when Apple’s M1 is putting the squeeze on other manufacturers by showing how low power and high performance can go together?

I just got a screen that delivers just 40W over USB-C, which keeps my M1 MBP fully charged, and it's been a revelation. Suddenly the fact that this machine's charger is unique in a house full of other MacBooks is irrelevant.


I used to hate USB-C's varying levels of support.

Then I got a laptop (from work) that takes power over a USB-C port, not much different from any other custom power port.

Then I got a docking station (from work), and the USB-C cable turned out to be capable of both powering the laptop and carrying data to the docking station over the same wire.

It was awesome. Otherwise, I would have needed two cables for power and data.


What kind of idiocy is this? Apparently you can already buy cables for the BADUSB exploit:

https://sneaktechnology.com/pentest-engagement-scenario/badu...

Better carry your original cable and device with you at all times.


This kind of wattage opens up some non-computing applications like charging small electric vehicles and power tools.


I know there are issues with it, but I love USB-C. My wife and I are building a weekender campervan (don't judge) and we decided there is no point spending money (and space) on an inverter anymore. Most things we have (laptops/phones/cameras) either already charge from USB-C (or are compatible with an adapter), or a 12V charger exists. In fact we won't even hard-wire USB-C wall ports, because cigarette-lighter USB adapters are constantly improving (the current best ones are around 60W on a single port). We are saving a ton of money, and modern DC-DC chargers are probably far more efficient and less bulky than a 12VDC->110VAC->9-12VDC conversion could ever be. I wouldn't be surprised if this trend continued and we started seeing little off-grid cabins go entirely DC. It just makes sense.


When you look at the pinout, USB-C is comparable to a conventional ad-hoc backplane in terms of the number of functions it supports.

Except the backplane implementations (i.e. the cables) are not uniform, so fun times.

At least with DB25-ended cables you could wire your own, albeit without the performance range.


OK, call up the USB-IF and ask them a question. A simple question. When USB-PD was developed in the ancient year of 2014, was it truly unforeseeable that, you know, laptop GPUs use a lot of power, maybe we should support more than the arbitrary 100W number on this connector?


It doesn't sound like 100W was arbitrary. Increasing power always means increasing either the voltage or the current, or both. Increasing the current means you need larger conductors, i.e. a larger connector and thicker wires. Increasing the voltage means mitigating arcing concerns, which was mentioned in the article.
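
Concretely: the connector caps current at 5A, so the only lever left was voltage.

  P = V × I
  100 W = 20 V × 5 A   (previous USB-PD ceiling)
  240 W = 48 V × 5 A   (USB PD 3.1 EPR ceiling; same 5 A, higher voltage)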


I thought the industry was moving away from Intel CPUs?


GPUs in gaming laptops use WAY more power than even the hungriest mobile CPU.

Strong mobile CPUs like the Ryzen 5800H are usually in the 45W ballpark, while mobile RTX 3080-class GPUs can go as high as 170W+, and that is still half of what the desktop equivalent can draw.


What's the cooling situation on such a laptop?


Somewhere between "drone" and "fighter jet" in most cases.


If you are paying money for a gaming laptop, you probably aren't tolerating the teensy unidirectional laptop speakers, and have some serious cans on your head for gaming that mask the helicopter on your desk.


Many of them use vapor chambers, they are quite good as a cooling solution[0].

[0]https://www.1-act.com/resources/heat-pipe-fundamentals/diffe...


Depends on the laptop.

Don't bother looking at the ASUS G14; it's deafeningly loud when you put too much power into it. Bold statement, but I've warned you: if you need power, you're going to hear the fans. There's currently no solution for cooling it quietly.


I'm somewhat confused by all this.

My desktop PC has an RX 5600, which I understand isn't particularly power hungry (nor powerful), but I still consider it a decent GPU. And the card, with its heat sink and fans, isn't all that thick and stays reasonably quiet under load, even though I have a 135W CPU next to it. It could probably be even better if the fans were pulling air directly from outside.

At work, we have HP EliteDesk 800 minis, which are fairly slim (for a desktop) too. They usually have desktop i5 and i7s and are fairly quiet and don't seem to throttle. They do get somewhat loud at full load but probably because the heat sink is ridiculously small.

So I'm wondering: why aren't there more thick laptops? I'm not sure a laptop that requires 200+ watts would last all that long on battery, so I suppose the main use case is an easy-to-carry, all-in-one machine, not so much a use-it-all-day-on-the-lap affair. So does thickness really matter that much? These machines target a particular audience, so they wouldn't even have to play the thinness marketing game against more popular offerings; they probably wouldn't even be shown next to each other.

I remember my dad had a laptop around 2005, in the Athlon XP days, which was definitely thicker than my desktop GPU and the HPs, and it was still fairly usable on the go. I can't believe that a laptop that thick would generate airplane-level noise.


Difficult to give accurate answers to "They do get somewhat loud", but:

- Laptops have components that are very close together.

- The cooling system is sometimes shared between the CPU and the GPU.

- The other comment under yours is also true.

- As for thicker laptops: there are "thick" laptops in both the gaming and business segments.

"I'm not sure a laptop that requires 200+ Watts would last all that long on battery"

- Power/resources can be scaled up and down.


>So I'm wondering, why aren't there more thick laptops?

Because the majority of consumers have voted with their wallets for thin, sleek and sexy devices with poor cooling that are loud and thermal throttle.

If you want bulky, well cooled laptops, there are enough to choose from. XMG makes them for example.


Depends on the laptop; you can't make sweeping statements.


How much do you like your hair-dryer?


USB-C Power Delivery does not require an Intel CPU, so I don't see a problem.


I think he was referring to the absurd amount of power some Intel CPUs require.


My SO has an AMD gaming laptop that came with a 135W power supply. Surprised the heck out of me as I typically see Lenovo/Apple laptops with 45W or 65W power supplies. Doesn't seem to be an Intel-only thing.


Then other power-hungry device categories will become the beneficiaries of this. External GPUs with a high power draw, for instance.


don't external GPUs have their own power?


Currently they do. They wouldn't need it if such a high-power connection became the norm.


Honestly, providing ridiculous power levels that almost nobody will ever use feels a little lower priority than making the connector spec more durable. I've had multiple problems with USB-C connectors breaking (including one that broke on a monitor and ruined it), and damaged pins that actually bricked a phone. I've recently started having a Galaxy Ultra claim the connector is wet even when it isn't, and even after I've cleaned it.

I can't be the only one having constant issues with USB-C, but I also haven't been able to verify that this is a common issue for other people so maybe I am just a reckless user of technology.


Another variable to add to the mix:

"Well, it's a USB-C cable and a USB-C hole"

"... but no, it does not work"


> "while my own Dell XPS 15 can technically charge over USB-C, it needs 130W of power to charge and run at full bore simultaneously."

Meanwhile, my M1 MacBook Air can charge and run at full bore on a 20W iPhone charger!


I plugged in USB-C once and destroyed a new Dell laptop by burning out the motherboard. The whole let's-mix-communication-and-power thing seems like a sketchy way to sell more computers and e-waste.


Does there exist a device that you can plug in USB and HDMI cables and have it tell you what features, modes, etc that cable supports? Seems like having something like that wouldn't eliminate all the problems but would make them easier to deal with by quickly identifying that the cable you tested is or is not appropriate for the task at hand.

That aside, I'm looking forward to running my laundry room off of a dozen USB-C connectors hooked up in parallel.


This is my only qualm with all the different standards. I just want some way, any way at all, to tell whether stuff works the way it's supposed to. Either label the cords and outlets (via printed acronyms, or color-coding, or cryptic symbols, I don't care), or provide the information software-side.

I can't believe Microsoft and Apple haven't built this into their operating systems. Sometimes it's near-impossible to figure out what Bluetooth standard I'm using, or HDMI, or USB, and I feel like it wouldn't be a monstrously hard problem to solve, as long as the devices are already successfully talking to each other.


If you just want to know whether a USB cable can safely deliver a certain current, then a cheap multimeter might do.
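
A sketch of the arithmetic behind that (the resistance values here are just illustrative): measure the cable's round-trip resistance, then Ohm's law gives you the drop and the heat.

  # Back-of-envelope cable check from a measured round-trip resistance.
  def cable_check(r_ohms, amps):
      drop = r_ohms * amps           # volts lost along the cable
      heat = r_ohms * amps ** 2      # watts dissipated along the cable
      return drop, heat

  print(cable_check(0.05, 3.0))  # 50 mOhm at 3 A -> 0.15 V drop, 0.45 W of heat
  print(cable_check(0.05, 5.0))  # same cable at 5 A -> 0.25 V drop, 1.25 W of heat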

If you want to test whether a cable will work with a certain protocol at a certain data rate, just plug it in and see if it works, because the alternative method[1] costs hundreds of thousands of dollars.

[1]https://www.keysight.com/ca/en/products/bit-error-ratio-test...


This seems absurd. Is there a reason why Microsoft can't build a feature into Windows that tells you what USB spec a cable is using? (And details like transfer speed, power, etc.)


48V at 5A through a tiny USB-C connector. That's pushing a lot of power through dinky wires.

Next, people will want to be able to jump-start a car via USB-C.


What are the costs to add a USB PD module to an electronic device? https://hackaday.com/2021/04/21/easy-usb-c-power-for-all-you...

- Create an industry-standard interface for charging and using battery packs (e.g. for power tools), plus adapters


And charge other electronics much faster: power tools, household pluggables. This market isn't sexy, but it's a good time to disrupt it.


I mean, I like simple standards, but at what point do I have to start worrying about my mouse electrocuting me because my cat chewed the cable?


I mean, your cat could have chewed through your laptop cable before, too.

Your mouse, if using USB-C, is only going to draw the current it needs to run. So it'll be milliwatts at 5V; it's not going to be doing 240W. USB power delivery isn't like your wall outlet: a device asks for and receives power depending on its need.


I'd like to see the next version of this spec go from 48V 5A to 2000V 2A.

5 amps is getting into dangerous territory with cables that are bent a lot, where heating is concentrated in one high-resistance area. Heating => melting => fire => deadly. Fire kills 10x more people than electrocution.

2000V is also well into dangerous territory for humans, but we now have the technology to make it safe for a child to gnaw through the wire in operation and avoid injury. All that's needed is for the spec to require current-leakage detection: as soon as 1 milliamp or more is unaccounted for between charger and device, you assume it's flowing through a child's tongue and power off within a few tens of microseconds.

That requirement means you need MOSFETs at each device (to be able to power off quickly), and to have capacitance control of the cable.

Every device needs to verify that the device on the other end and the cable do these things, so that nobody can make a cheap cable or widget that kills people.
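
A toy sketch of the trip logic being described (names, threshold and sample plumbing are all made up): compare outgoing and returning current and kill power on any imbalance, GFCI-style.

  LEAK_LIMIT_A = 0.001  # the proposed 1 mA trip threshold

  def check_leakage(source_amps, return_amps, cut_power):
      # Any current that leaves the charger but doesn't come back is
      # assumed to be flowing somewhere it shouldn't (e.g. a person).
      if abs(source_amps - return_amps) > LEAK_LIMIT_A:
          cut_power()   # must complete within tens of microseconds,
                        # hence solid-state switches rather than relays
          return True
      return False

  # demo: 10 mA unaccounted for -> trip
  print(check_leakage(1.000, 0.990, lambda: print("power cut")))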


Is there no technology to check for changes in resistance? Even if the heating is confined to a small area, that spot's resistance should change much more rapidly as it heats than the resistance of the whole cable does.


Careful resistance monitoring could probably make 5 Amps sufficiently safe to avoid fires. But that's about the limit of it without the cable starting to get very bulky (due to needing thicker copper to keep heating down).

If you want a cable spec which is never the limiting factor for whatever new gizmo you're trying to design, and therefore want a high power limit, then it's really the voltage that needs to increase, since it can go a lot higher before engineering the insulation gets hard.


This should be the new standard for ebike charging, or all personal EVs like scooters and EUCs.


I wonder if this will mean that USB-C wall sockets will start to support more than 25W at a time so I can just plug a USB-C cable directly into the wall without the need for an adaptor.


USB standards group naming and branding continue to be the worst.


The main issue I have is manufacturers of equipment relying on the charger to limit the current being delivered.

This is appalling design, and is a recipe for disaster.


This will be great for the small-form-factor PC space. If we can get mass-produced aftermarket 240W power supplies that live outside the case, that would be pretty nice.


I just found out that USB-powered coffee heaters are a thing.

As a general purpose power cord, USB-C is not a good choice.


Shouldn't it stay more data-oriented? More power means more danger...


That's a LOT of current. At 5 volts that's 48 amps.


Next in playlist: Talking Heads - Burning Down The House


No. Laptops should all become more energy efficient.


I hope something will be done about proprietary chargers that go bad every few months - laptop stops recognising them as original.


I fear for cats everywhere.


yay global warming


Uh....

Melting USB-C connectors at 65W are already bad enough.

The problem is that there is no way to detect a bad contact, and these contacts tend to go bad.

A few specks of dust, and you have 5 amps going through a single pin.

Even if you only have split-second momentary disconnects, you can get welds on the contact pads, which will degrade the contact over time.

On another note, people say Intel may be increasing laptop CPU power budgets into 60W-70W territory to counter Ryzen. I think it makes sense now why they'd do it.


I used the charger + cable which came with my OnePlus 7 Pro to charge my Samsung Galaxy S8. The cable and port on the phone must have melted and solidified into one unit, because the next morning I couldn't unplug it. With more force the cable came out with the plug damaged and the USB-C male part in the phone ripped in half.

I don't think OnePlus makes incredibly high quality & safe chargers like Apple/Samsung, but they're not the cheapest Amazon garbage either.

This might be a rare issue, but it does happen. Combined with the mechanical degradation that USB-C ports go through (not as bad as micro-USB, but worse than full size USB-A - A does get loose but still makes good electrical contact), I specifically looked for wireless charging in my next device and try to avoid using the USB port as much as possible.


Is it really undetectable? A smarter USB PD controller and maybe some extra sensors should be able to mostly avoid that problem, no?


Most devices of meaningful value build in USB-C port protection parts specifically for this reason. Here's a popular one from NXP: https://www.nxp.com/products/power-management/load-switches/...


"smarter" an "should" are key words.

Not everybody buys the best hardware in class. Most hardware is cheap Chinese garbage for which the only qualification is that it isn't bad enough to be brought down from Amazon.com.

Go explain your grandma or girlfriend why the charger they bought damaged their laptop irreparably.


Ok, so you detect a voltage drop of, say, 0.4V@5A at the phone end, slightly out of spec. How do you decide whether it's a slightly underspecced cable (quite common) with 2W being radiated as heat along a 1m cable (not really an issue), or a great cable with high contact resistance, where the 2W is concentrated in a tiny space that can't shed the heat?


You could design a port with multiple contacts onto the same cable pin. One contact does voltage sensing while the other takes the current.

Then any amount of dirt in the connector can cause whatever heating it likes, but the device can always calculate how much heat is being dissipated in the connector.
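
Sketching the idea with made-up numbers: with a dedicated sense contact, the device sees the voltage after the contact, so the heat generated in the connector falls straight out of Ohm's law.

  # Connector dissipation from a Kelvin (force/sense) measurement.
  def connector_heat(v_force, v_sense, amps):
      return (v_force - v_sense) * amps  # watts burned in the contact itself

  print(connector_heat(5.00, 4.92, 5.0))  # an 80 mV contact drop at 5 A -> 0.4 W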


Nice idea. :) Thanks.


The cheapest option I see is a fusible link from some very low melting point material.


No, unless you put a sense resistor on every power pin, and add circuitry to individually measure current per pin.


If the alternative is starting house fires that sounds pretty reasonable.


The problem isn't whether reputable manufacturers will do it, it's whether the bottom of the barrel cheap cables from eBay/Amazon will do it.

The advantage of USB 2 is that it's very hard to screw up. The design is so simple that even the cheapest cable is usually "okay", because making an "okay" USB 2 cable is so simple.

In contrast, making a USB-C cable is much more difficult, which means unscrupulous manufacturers flood the market with bad cables that fail with disastrous side-effects.


The solution would be for devices to test cables before letting them work.

If my iPhone tested that the cable was up to spec before charging and said "error, bad cable" if any test failed, then the cheap Chinese cables would be forced to pass all the tests.
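
Something like this hypothetical pre-flight check (all names and thresholds invented for illustration): ramp the test current up gently and bail out the moment the voltage drop looks worse than a sane cable's.

  # Hypothetical pre-charge cable qualification (illustrative only).
  MAX_DROP_AT_3A = 0.25  # assumed allowance: 250 mV of drop at 3 A

  def preflight(measure_drop):
      # measure_drop(amps) -> measured voltage drop across the cable.
      # Scales the 3 A allowance linearly with current, i.e. a fixed
      # resistance budget of about 83 milliohms.
      for test_amps in (0.5, 1.0, 3.0):
          if measure_drop(test_amps) > MAX_DROP_AT_3A * test_amps / 3.0:
              return "error, bad cable"
      return "ok"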


If the pin turns bad during use, you can't do much.


USB-C toaster when?

And since USB-C PD actually requires a microcontroller, we'll finally be able to run NetBSD on unmodified toasters.


I've never heard a convincing (or even plausible) argument for why having identical connectors and cables with different capabilities is an advantage.


Laptops shouldn't ever need 240W. That's just getting stupid. Buy a desktop if this is what you need!


No thanks, I’ll keep my gaming laptop. Far more portable than a desktop, lots of compute overhead, and I can use it for games. At home it’s basically a desktop with external keyboard and monitors.

Oh yeah, and when it’s not being used with the dedicated graphics card, it gets 6-9 hours of battery life.

Oh, almost forgot - it’s lighter and about the same size as my 2012 13” MBP.


Does it require 240 watts?


I guess not, it says 180 on the back. Haven’t bothered to meter it under full load + charging though.

To be clear, I'm skeptical of pushing that much power over USB-C.


I can't take a desktop with me on an airplane.


You can! Many years ago I took a desktop machine (minus monitor, keyboard, etc., as my friends already had spares at the destination) on not one but two flights (there and back). Surprisingly, the people checking in my bag didn't seem too surprised; the only unusual thing was that they asked me to sign a declaration that the airline wouldn't be responsible for any damage. The bag tag went all around the machine (I didn't have the original box, so I took a gamble to see if the airline would accept the machine itself) and off it went on the belt at departure. On arrival it came around the carousel with all the other bags, and it worked fine after both trips. All so I could join a gaming session with my friends in another city! It might be more tricky once you factor in a monitor and other things, and I probably wouldn't do it again now that I have a laptop that can run some pretty decent games, but good memories... :)


An SFF machine would do nicely. A NUC even more so, but it's not really up to heavy workloads.


You can’t take a laptop with a battery larger than the 16” MacBook Pro on an airplane either! https://www.theverge.com/2019/11/13/20962380/apples-16-inch-...


Nobody sells laptops with batteries bigger than that! Guess why? Because you're not allowed to take them on planes. Just means gaming laptops have less battery life is all.


Most of these gaming laptops likely don’t; they’re really not intended to be used for heavy gaming while on battery, because the battery can’t support the discharge rate needed by a high end GPU.


For some people (like me), a laptop is merely a portable desktop.



