Melting USB-C connectors at 65W are already bad enough.
The problem is that there is no way to detect a bad contact, and bad contacts are common.
A few specks of dust, and you have 5 amps going through a single pin.
Even split-second momentary disconnects can weld the contact pads, and those welds degrade the contact over time.
On another note, people say Intel may be increasing laptop CPU power budgets into 60-70W territory to counter Ryzen. Now it makes sense why they would.
I used the charger + cable which came with my OnePlus 7 Pro to charge my Samsung Galaxy S8. The cable and port on the phone must have melted and solidified into one unit, because the next morning I couldn't unplug it. With more force the cable came out, but the plug was damaged and the USB-C male part inside the phone was ripped in half.
I don't think OnePlus makes incredibly high quality & safe chargers like Apple/Samsung, but they're not the cheapest Amazon garbage either.
This might be a rare issue, but it does happen. Combined with the mechanical degradation that USB-C ports go through (not as bad as micro-USB, but worse than full-size USB-A, which does get loose but still makes good electrical contact), I specifically looked for wireless charging in my next device, and I now try to avoid using the USB port as much as possible.
Not everybody buys the best hardware in class. Most hardware is cheap Chinese garbage for which the only qualification is that it isn't bad enough to be pulled from Amazon.com.
Go explain to your grandma or girlfriend why the charger they bought irreparably damaged their laptop.
Ok, so you detect a voltage drop of, say, 0.4V at 5A on the phone end, which is slightly out of spec. How do you decide whether it's a slightly underspecced cable (quite common) radiating 2W as heat along a 1m length (not really an issue), or a great cable with high contact resistance (where the same 2W is concentrated into a tiny space that can't shed the heat)?
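To put numbers on that ambiguity, here's a back-of-the-envelope calculation (plain Python, illustrative values only, the 1mm contact size is an assumption):

```python
# Back-of-the-envelope: a 0.4 V drop at 5 A dissipates 2 W somewhere,
# but a single end-to-end measurement can't tell *where* along the path.
current = 5.0          # A
measured_drop = 0.4    # V, seen at the phone end
power = measured_drop * current  # 2.0 W total

# Scenario 1: slightly underspecced 1 m cable.
# 2 W spread over a metre of wire -> roughly 2 W/m, barely warm.
cable_length_m = 1.0
print(f"Cable scenario: ~{power / cable_length_m:.1f} W/m along the cable")

# Scenario 2: good cable, bad contact.
# The same 2 W concentrated in a roughly millimetre-sized contact spot
# with almost no surface area to shed heat.
print(f"Contact scenario: {power:.1f} W in one tiny contact point")
```

Both scenarios produce the identical 0.4V drop at the phone, which is why the measurement alone can't distinguish them.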
You could design a port with multiple contacts landing on the same cable pin: one contact does voltage sensing while the other carries the current.
Then any amount of dirt in the connector can cause whatever heating it likes, but the device can always calculate how much heat is being dissipated in the connector.
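This is classic four-wire (Kelvin) sensing. A rough sketch of the arithmetic the device could do, with made-up pin voltages for illustration:

```python
# Four-wire (Kelvin) sensing sketch: the sense contact carries ~zero current,
# so it reads the true potential at the cable pin, unaffected by contact
# resistance. The device can then compute the heat dissipated in the contact.

current = 5.0    # A flowing through the force (current-carrying) contact
v_force = 4.95   # V measured on the device side of the force contact
v_sense = 4.80   # V read through the zero-current sense contact (made up)

contact_drop = v_force - v_sense            # voltage lost across the contact
contact_resistance = contact_drop / current
contact_power = contact_drop * current      # heat dissipated *in* the connector

print(f"Contact resistance: {contact_resistance * 1000:.0f} mOhm")
print(f"Heat in connector:  {contact_power:.2f} W")
# If contact_power exceeds a safe threshold, the device could throttle
# charging current instead of letting the connector cook.
```

The design choice here is that dirt changes only the force path; the sense path stays accurate because no current flows through it, so its own contact resistance drops no voltage.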
The problem isn't whether reputable manufacturers will do it, it's whether the bottom of the barrel cheap cables from eBay/Amazon will do it.
The advantage of USB2 is that it's very hard to screw up. The design is so simple that even the cheapest cable is usually "okay", because making an "okay" USB2 cable is so simple.
In contrast, making a USB-C cable is much more difficult, which means unscrupulous manufacturers flood the market with bad cables that fail with disastrous side-effects.
The solution would be for devices to test cables before letting them work.
If my iPhone tested that the cable was up to spec before charging, and said "error, bad cable" if any test failed, then china-cables would be forced to pass all the tests.
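One way such a test could work is to measure the cable's end-to-end resistance at a harmlessly low current before ramping up to full power. A hypothetical sketch, where the measurement function and the thresholds are invented for illustration, not any real USB-PD API:

```python
# Hypothetical cable qualification before enabling fast charge.
# measure_vbus_drop(current) is an invented stand-in for whatever ADC
# readings a real charge controller exposes; thresholds are made up.

SAFE_TEST_CURRENT = 0.5   # A: low enough to be harmless even on a bad cable
MAX_RESISTANCE = 0.25     # ohms: invented pass/fail threshold

def cable_passes(measure_vbus_drop) -> bool:
    """Qualify the cable at low current before allowing full power."""
    drop = measure_vbus_drop(SAFE_TEST_CURRENT)   # volts lost end to end
    resistance = drop / SAFE_TEST_CURRENT
    return resistance <= MAX_RESISTANCE

# Simulated cables: a good 0.1-ohm path and a bad 0.6-ohm one.
good_cable = lambda i: i * 0.10
bad_cable = lambda i: i * 0.60
print(cable_passes(good_cable))  # True
print(cable_passes(bad_cable))   # False
```

Testing at low current is the point: a marginal cable that would overheat at 5A only dissipates milliwatts at 0.5A, so the check itself is safe.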