I would say the major problem it had with adoption was that wired USB also provided power. (A lot more people use usb to charge their phone than to sync their phone.)
So great - wireless connectivity... but you still have to plug the device into a cable at some point (or have replaceable batteries), which makes the value proposition a lot less clear.
Beyond that, it suffered from the usual chicken-and-egg adoption problem. Laptop manufacturers didn't want to add it because it was an expense that didn't drive sales (there were no must-have peripherals that used it), and peripheral manufacturers didn't want to make wireless USB devices because they couldn't be used with a standard laptop (at least not without a WUSB dongle, which raised the cost).
Still, very fun stuff to work on.
That may explain why I can't find a 5 Gbps wireless USB extender for my work web cam.
https://hardwarerecs.stackexchange.com/questions/18983/wirel...
Unfortunately, at the time there was only one phone that had wireless charging built in (Palm Pre). I know our sales and marketing did try to engage with them on getting wireless usb into "the next version" of the Pre, but nothing came of it. I don't know the details.
At this point wifi is ubiquitous enough that a new version of wireless usb would have a hard time competing with it though.
Bluetooth didn’t really hit mainstream until the arrival of chipsets that multiplexed Bluetooth and WiFi on the same radio+antenna. My memory is that happened sometime around 2007-2010.
At that point, the BOM cost to add Bluetooth to a laptop or smart device became essentially zero, so why not include it? Modern smartphones with both Bluetooth and WiFi arrived at around the same time. (I suspect these combo chipsets were originally developed for handheld devices, and laptops benefited.)
And once Bluetooth was mainstream, we saw a steady rise in devices using Bluetooth.
WUSB operates on a completely different set of frequencies and technology and couldn’t share hardware with WiFi. Maybe it could have taken off if there was a killer app, but there never was.
Don't forget music piracy.
At least over here, a lot of kids had phones that did Bluetooth, and the primary use case for it was sharing songs they liked with each other. You could use infrared (IrDA) for that, and some people did before Bluetooth was common, but it was much slower.
This was mostly on low-end Nokias, maybe with a bit of Sony Ericsson thrown into the mix. They definitely did not have WiFi; in fact, Nokia even tried to limit internet over Bluetooth for the usual carrier-monopoly reasons, as far as I'm aware. But Bluetooth was definitely there.
For many here, the iPhone not doing file and ringtone sharing over Bluetooth was one of its main limitations, at least early on. It was a social network in its own way, and having a device that couldn't participate in it was no fun.
The wireless headset was the killer app that drove Bluetooth adoption in cellphones, driving down costs until eventually the lower-end models received it too. While sharing files was possible in the 1999-2005 era (especially with PDAs), most phones lacked the flash storage to hold anything worthwhile.
While I don't want to say file sharing wasn't a killer app, it does seem to have been limited to just schools during a certain time period.
A time period that I missed out on by a few years. At high school, we did all our file sharing by swapping burned CDs. Then we switched to dragging around laptops and USB hard drives at university (and using the private emule network on the university wired ethernet).
Remember companies like Jawbone?
I vaguely remember a cultural stereotype of BMW drivers driving aggressively and wearing Bluetooth headsets. [edit: this is the clip https://youtu.be/UqfAMvXpSw4?t=25 from Top Gear of Jeremy Clarkson wearing a Bluetooth headset in sunglasses in a BMW, supposedly from Top Gear season 10, episode 10]
What you describe is file sharing, not necessarily piracy :-). Just nitpicking, I understand what you mean of course!
No BT stack in your product, no BT radio initialization, no BT/wifi multiplexing. At least in the (admittedly limited) chips I’ve worked with.
But the wireless headset is now a horrifying millstone making Bluetooth look like the world's stupidest trash fire. If you enable your microphone, you lose all audio from anything that doesn't want to use the microphone, as the headset switches into "headset" mode (the low-fidelity, bidirectional HFP/HSP profile) and drops anything that wants to use "headphones" mode (the high-quality A2DP profile). There is no reason for there to even be two different modes.
Why is this still happening?
Implementing special behavior always loses out, to users' inconvenience, because no B2C company wants to risk bad "the microphone didn't work" reviews, customer returns, and support tickets.
Never mind coordinating with arbitrary USB microphone latency… I've got one with 250 ms of it.
I don't think you have any idea what you're saying. The scenario I'm describing is when you want to use a bluetooth headset that includes a microphone. Using a different microphone is how you solve the problem.
If you use Linux + KDE, you can still use any microphone or headphone, many at the same time, or in whatever mode you want.
It used to work on KDE Plasma 5 at some point. And after a minor version update it stopped working.
Now the mic of my headset doesn't work because KDE insists that only the high quality sound output without mic is available. The mic + low quality output is gone from the settings.
Lucky for me, this update also brought proper handling of the stereo-positioned noise-cancelling microphones on my ThinkPad. So now I can actually enjoy the luxury of built-in microphones that work. Until the day it won't, I guess.
It maybe could have worked with better marketing, but convincing potential customers to change something that works (somewhat, BT wasn't without issues) is hard. That's why we are keeping abominations like cigarette lighter sockets in cars even though they often can't even light cigarettes anymore. It is already well established and it works well enough as a power outlet.
IR was exceptionally slow, required line-of-sight and even at the time, felt like a shitty solution. So even though the early implementations of Bluetooth also left a lot to be desired (battery hungry, insecure, and also slow), it was still a massive improvement on what came before.
Wireless USB wasn't a significant enough improvement over Bluetooth, given that BT was already ubiquitous by that point, and by then cheap and battery efficient too.
Back when BT was new, I used to get all sorts of random shit pushed onto my phone every Friday night on the drunk train home from London.
Some devices would even establish an IrDA connection automatically as soon as they found anything. I have friends whose laptop names have suddenly appeared on lecture room projectors, because their laptop's IrDA receiver was in direct line of sight of the teacher's.
Not that you couldn't do that with Bluetooth, some early BT chipsets gave you a "<device name> wants to connect to you" dialog box any time somebody tried sending something to your device. This could be abused, to great student amusement, to display funny messages on that same projector if the lecturer's laptop had such a chipset.
You could probably solve those issues with modern tech though. Things have advanced significantly since IR was popular. For example, back then Bluetooth was slow too.
The problem with IrDA is that it's old. Technology has advanced significantly since the '90s, when IrDA was popular on cellphones, so a modern implementation could achieve much better data rates, even accounting for significant interference from the environment. We barely had WiFi back then, and now it'll do a few hundred megabytes per second without breaking a sweat (your ISP might, though); the technology required for that didn't exist in the '90s. But we have Bluetooth now, so there's the same bootstrapping problem: you'd just use Bluetooth rather than spend a bunch of money building a system very few people are asking for, and so there's little demand for a modern high-performance IrDA in any devices.
The only really clunky use case for me was internet access - keeping phone and laptop positioned and aligned for 30 minutes was limiting.
And yes, there IS plenty of bandwidth at those frequencies. In fact, the latest IR standards reach 1 Gbps, but the technology is pretty much extinct. There was an attempt called Li-Fi to use it for wireless networking, but I don't think it went far.
What I really miss is OBEX (Object Exchange), which worked also over Bluetooth, and which Apple sadly chose not to implement: simplest protocol to just ship a file or a contact vCard over, no setup, just worked - and it's been a standard for 20+ years. Early Android had it too, it was since dropped I think. Sigh.
In the days before Bluetooth, transferring MP3s over IR took multiple minutes, even on high end (for the time) handsets.
And the fact that you needed to keep line of sight during the whole process meant your phone couldn’t be used that whole time. Which was a real pain in the arse if you got a text message or phone call while trying to transfer a file.
IR was really more designed for swapping contacts. In fact that’s exactly how BlackBerry (or was it Palm?) marketed IR on their device: a convenient way to swap contact details. But you’re talking about a few KB vs several MBs for an MP3.
The tech has definitely moved on since. But then so has Bluetooth, WiFi and GSM et al too.
You have two mainstream protocols now: one for low-energy, slow data transfers (Bluetooth) and one for fast but more power-hungry devices (WiFi).
I don't see the use case for UWB.
In my opinion, it came down to the timing and usefulness of Bluetooth, in an era when Nokia ruled the world. There were many other reasons too, though.
Wirelessly transferring files between a phone and a computer seems like a big use case. Still no easy standard way of doing it.
They might want to transfer (a better word: share) photos/videos, documents, etc. And for those they use specific apps and "the cloud". No "files" (for the sake of files), and barely any hierarchy (folders etc.).
As long as the thing they want to share magically shows up on the other device, or at the other person's end, they are happy. They just skip two levels of abstraction ("this photo is a FILE and I will use USB to transfer it"). Maybe a far-fetched analogy, but this is why most drivers of an automatic don't really think about clutches and how the engine's torque is converted.
At least this is my perception (outside the IT bubble)
Sure, I may be in a photo gallery and I may want to share a few photos with a friend who may want those photos to be treated as photos (instead of going into a big "Downloads/" folder). But it doesn't mean, at all, that the concept of file has to disappear to the user. In fact the files still very much do exist on the system. Product people just assume users are stupid, IMHO.
And the thing is: this abstraction (not knowing what a file is) doesn't make it faster or more efficient. It just makes the user more dependent on their platform and apps. Look at backups: product people at Google/Apple will tell you "people don't want to backup their files, they want to pay us to make sure that they never lose an image". Conveniently, it means that people are 1) forced to pay them and 2) don't have control over their own files.
Maybe GenZ/alpha now are stuck with these abstractions because they never learned what a file was (for no reason other than being abused by product decisions), but older generations grew up with physical media. "I have a piece of paper, I have a book, I have a CD-ROM, and those are all different kinds of files that can go into different "boxes" that are called folders".
Files and folders are very natural. The reason people don't know about them is because we hide them and force them to pay for literally subpar experience.
Of "folder" and "directory", I most frequently use the latter when I am talking about files and filesystems.
Most people answer with "folder", and I am sure that has to do with my having learned these things where "directory" was the norm.
I have been educating people about files when I bump into ones that do not know much. The abuses are real and growing. Nice comment.
It's quite clear that you never had to explain why "only looking at pictures/photos on the Internet" still used up mobile data.
Or is it rather that you consider yourself one of the few people smart enough to memorise it? I find that very condescending.
I don't believe that one needs 3 postdocs to understand it. In fact, I do happen to have explained it quite a few times, and I don't remember anyone not understanding it.
Because we can't easily transfer files between devices remotely, we got used to doing it via apps. And so we never developed good local file browsers (esp. for media), and companies invested in cloud UIs, mostly because they could sell the storage and sharing capabilities. That was all unnecessary, but we're used to it now, to the point where sharing files feels weird.
As a power user happily syncthinging all my files between all my devices, I'm sad, because files are the easiest thing to share, organize, transfer, etc. I wish iOS supported this kind of app (full storage access!), as we could avoid the many crazy app-specific workarounds just to share some stupid files.
And don't confuse the file itself (say, a pirated movie), the metadata (IMDb IDs) and the app's UI (Kodi!). Files are what we have; we should share files and let anyone pick the browser/app they like for viewing, organizing…
On the other hand, I don't mind that full storage access is a "pain"; I don't even remember which apps I gave the permission to, and I would certainly be angry if my syncthinged files were stolen by some other app that turned malicious.
All that said, since people don't think of the documents/photos/other stuff in their homes as "filed items in folders", non-tech people also don't think about their digital items that way. And maybe this is alright; if "file-ification" had been that compelling, better products would have emerged.
One of my great pet peeves with all smartphone and cloud apps is the "abstraction" and the reliance on search. For me, folders are quicker and less error-prone, and as a bonus they save on unneeded bandwidth (to load previews) and computing costs.
Also, stop telling me I must use your one-off "feature set" of sorting and ordering, which either nobody uses or everybody copies differently. The number of square wheels reinvented (for me, I must add; ymmv) is astonishing.
Machine Learning is making this better, but ideally albums or folders wouldn't be such a pain in the ass to actually use in day-to-day life.
If your music is stored in a folder hierarchy, and can, in principle, be located anywhere, how do you index it to provide a library view? How do you distinguish it from random audio files that just happen to be ID3 tagged, but which you don't want as part of your permanent music collection? How do you efficiently react to deletion events? What happens if you delete an entire artist's worth of music from your music app? Should it delete the files, or only the library entries? If it deletes files, what if (some of) that music was in a folder that didn't contain any other files? Should that folder be gone too, or should you be left with an empty folder or hierarchy? What if the folder also contained a .nfo, is it good UX if it deletes the music and just leaves the .nfo?
If the only tool you have is a computer, everything is a file. If you're a music lover and not a computer enthusiast, you tend to think about albums, artists and playlists, and that's how you want to view your music collection.
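Mechanically, the indexing asked about above is the easy part; everything else is policy. A minimal sketch of the folder-scan side in Python, assuming the real mutagen tag-reading library (LIBRARY_ROOTS and the index shape are illustrative, not any player's actual design):

    # Build an artist -> album -> tracks view over opted-in folders.
    # Untagged files are skipped here; deciding what counts as "library"
    # vs. "random ID3-tagged audio" is exactly the open policy question.
    import os
    from collections import defaultdict
    import mutagen  # pip install mutagen

    LIBRARY_ROOTS = ["/home/me/Music"]      # folders the user opted in to
    AUDIO_EXTS = {".mp3", ".flac", ".ogg", ".m4a"}

    def scan_library(roots):
        index = defaultdict(lambda: defaultdict(list))
        for root in roots:
            for dirpath, _dirs, files in os.walk(root):
                for name in files:
                    if os.path.splitext(name)[1].lower() not in AUDIO_EXTS:
                        continue
                    path = os.path.join(dirpath, name)
                    audio = mutagen.File(path, easy=True)
                    if audio is None or not audio.tags:
                        continue
                    artist = audio.tags.get("artist", ["Unknown"])[0]
                    album = audio.tags.get("album", ["Unknown"])[0]
                    title = audio.tags.get("title", [name])[0]
                    index[artist][album].append((title, path))
        return index

Deletion semantics (files vs. library entries, orphaned folders, leftover .nfo files) sit entirely outside this function, which is rather the point.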
What about playlists?
The limitation of the folder is that there’s only one.
I said it needs a place to write playlists (or write access to your playlist folder(s)).
I wouldn't do it this way, but there can be more than one folder containing the same file (hardlinks).
People don't really internalize that those are two different use cases.
Yes there's AirDrop, but I think most people view it as more of a "discoverability" solution than a file sharing solution. If you meet somebody you don't have a number for, "okay, just AirDrop this to me" is much easier than doing the whole song and dance of adding them to contacts and sending them an iMessage or finding them on WhatsApp. Whether the actual file transfer part of AirDrop goes over the internet or over Bluetooth isn't something most people care about; as long as it can discover nearby devices and initiate a transfer to them, it's good enough.
Everybody, and I mean everybody, is capable of understanding that to connect their Bluetooth headset to their phone, they do it over Bluetooth. And that to connect to the Internet, they can either go over WiFi (which is "free") or cellular (which is less "free").
> People don't really internalize that those are two different use cases.
We actively keep them ignorant, and then we use their ignorance as a justification. I find it sad.
What if we said "People don't want to drive their car somewhere, they want to go from A to B. We should prevent them from learning how to drive so that they would have to pay for our taxis".
For example, the file API does not offer a clean, uniform, and reliable way to associate a resource with metadata.
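As one illustration of the non-uniformity: Linux offers extended attributes, but support varies by filesystem, they don't survive every copy or transfer, and macOS/Windows use different mechanisms entirely. A hedged sketch (the user.* keys are invented for the example):

    # Attach ad-hoc metadata to a file with Linux extended attributes.
    # Works on xattr-capable filesystems (ext4, xfs, btrfs, ...); other
    # OSes need different APIs, which is the complaint in a nutshell.
    import os

    path = "holiday.jpg"                        # hypothetical file
    os.setxattr(path, b"user.rating", b"5")     # "user." namespace, our own keys
    os.setxattr(path, b"user.album", "Summer 2019".encode())

    print(os.getxattr(path, b"user.rating"))    # b'5'
    print(os.listxattr(path))                   # ['user.rating', 'user.album']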
Seems to me we very rapidly arrive at records or entities.
We see both these days in databases.
Entities show up in CAD and simulation. Records show up in business tools of various kinds.
All require a schema and serious dependencies flow from there.
In CAD, for example, the database schema can change quite dramatically from version to version of the same software tool. And all this makes writing plug-in tools, or anything really, painful.
And forget exchanging native data between systems. STEP exists for that, and oh God help you on a bigger project involving any old data.
The thing about files is they are basically EASY.
And easy, when looking at where we are going, matters. A lot.
Files can exist on pretty much anything. Paper tape, mag tape, all sorts of media, up to advanced storage tech.
Databases are a different story.
I am not convinced we are anywhere near being ready for that huge leap.
And I would normally say "forward" but on this?
Nope!
It would be a huge mess, requiring we toss just about everything we have in use today.
If not, then you're not abandoning the file layer at all. You're just preventing people from benefitting from it.
Windows 11 still supports it, I think macOS too. Pairing is technically optional.
I had to pair, or at least I think I did. I was fetching a file off a flip phone. Doing these things without "file" gets weird quick.
I also seem to recall an awesome Bluetooth control panel applet I used a few times in Windows XP.
At the time I had a pretty spiffy Moto flip phone. It could be the computer keyboard, handle audio play and record and more.
Pretty sure that all came from the phone driver. I do recall also using the computer in reverse the same way when the display was badly damaged. I could make phone calls, dialing with the computer and in general use the phone taped to the back of my laptop screen.
Today, it is simpler, and far less robust.
It "solves" it but in a way that's ten times slower and fundamentally unreliable.
It's not such a big thing, though. I hardly use it, and young people don't seem to use it either. The stuff on their phone and laptop seem separate worlds, just like mine are. Might be because they don't know about it, though.
If I want to share something with someone else, there's a "File Sharing" section in phone's settings that enables anonymous WebDAV sharing, and it works fine too. There's Bluetooth OBEX too, but that one's fiddly.
TFA mentions that contemporary users of these things didn't get anywhere near Hi-Speed USB speeds. The author's present-day testing agrees with these reports, finding that at least one device's maximum performance was just barely better than USB FullSpeed.
If you were seeing 480Mbit performance with the hardware you were doing demos with, what went wrong between the demo table and the finished product?
Just like early WiFi, there were several companies working on wireless USB chips at the time, and performance could vary a lot depending on whose product you bought, and when.
Here's an article about us I found from 2008.
https://www.eetimes.com/staccato-communications-ultra-wideba...
The "ripcord 2" chips (mentioned) definitely could do 480Mbps at short range. I worked on the design of the next generation after that (equal performance, but lower-cost/lower-power-consumption), which never made it to the commercial market.
"What happened" was the combination of the product/ market mismatch I mentioned above (like, the wireless laptop dock was cool for a demo, but it didn't charge your laptop battery like a regular wired dock would, so it wasn't actually practical for daily use) so we didn't have enough revenue to self- sustain, and the "great recession" meant investment dried up and we eventually just ran out of money.
Staccato merged with a different wireless usb startup to try to delay the inevitable, and then tried to "pivot" to something profoundly stupid and I bailed at that point. (They did an internal demo of the new "product". It was maybe the worst tech demo I've ever seen. I was out 2 weeks later. I think the company dragged on for maybe another year.)
It does seem to be missing a pretty significant era though? There's 802.11ad (2011) / 802.11ay (2021) / WiGig.
It's mainly known for video, and is used today for VR headsets. But there's a huge variety of 802.11ad docks out there that also have USB, mostly about a decade old now! Intel's tri-band 17265 (2015) was semi-popular in its day as the supporting WiFi+WiGig+BT host adapter, and works with many of these docks. https://www.intel.com/content/www/us/en/products/sku/86451/i...
I've definitely considered buying a dock & WiGig mPCIe card & test-driving all of this! The price was way out of reach for me at the time, and I expect the performance caveats (range, speed, latency) are significant, but it could potentially genuinely help me run fewer cables around the office & the patio, and that would be cool. Afaik there's no Linux support though, so I haven't tried.
Not UWB-focused (but it could work over IP-capable UWB systems): I'd love to see more USB/IP systems emerge. It works pretty well for DIY (and kind of has for multiple decades now), but productization & standardization of flows feels hopeless, & worse, it feels like anyone who shows up is likely to do the wrong thing & make something proprietary or with nasty hooks. https://usbip.sourceforge.net
And not USB-specific, but pretty cool that the briefly mentioned 802.15.4 group continues to have some neat & continually advancing 6-9 GHz UWB work. IEEE 802.15.4ab is expected semi-soon. Spark Microsystems, for example, recently announced an incredibly low-power SR1120 transceiver, good for up to 40 Mbps and capable of very low latency. It'd be lovely to see this used somehow for generic/universal peripheral interconnect. https://www.hackster.io/news/spark-microsystems-unveils-its-...
I.e. the effort was driven by the USB-IF [2] that happens to be more hardware than software oriented. So they were eager to deliver a solution based around a new chipset that could be adopted immediately by anyone interested.
This failed to account for adoption friction/lag, and the era of ARM-based SBCs and WiFi proliferation which was already dawning (e.g. iPAQ handhelds were available at the time [3]).
So, they ended up with most of their envisioned use-cases [4] being covered either by SBCs or by Bluetooth. At least in retrospect, standardizing a pure software solution like USB over IP, as an added-value proposition for the USB standard, would have made more sense.
[1] https://en.wikipedia.org/wiki/Law_of_the_instrument#Abraham_...
Unexpectedly, battery time was never an issue. The WUSB chip in the receiver would overheat long before that and start throttling, leading to jittery head tracking.
Turned out, it was a widespread issue with that WUSB chip.
But that just bought us a bit more run time without actually solving anything.
From what I’d gathered at the time, it was a common issue with products that relied on that specific chip, and I doubt most shared our use case.
Wouldn’t a fan in a backpack just move hot air including the heat of its motor?
A quick Google says that the Oculus DK1 used ~3W, and you can easily find a fan that uses a fraction of a watt to move a reasonable amount of air, so this would probably have worked out.
A backpack is pretty much a closed system and chips use convective cooling.
Adding a fan won't create a positive pressure gradient between the backpack and the outside world, but it will add 3 or more watts of heat to the closed system.
I'll make this very simple: The hot chip is warmer than the ambient air because the rate of heat transfer from the chip to the air is low. A fan will increase the rate of heat transfer, thus decreasing the temperature of the chip and increasing the temperature of the air in the backpack. It will also increase the rate of heat transfer from the backpack air to the backpack, which will increase the rate of heat transfer from the backpack to the environment.
Notably, the fan would help even if the backpack was a magic closed system (which it is not; put a 100W computer and a 1kWh battery into it, open ten hours later, and you will not have anywhere near 1kWh of heat.) But why would it help in a closed system? Because the chip does not care about the total energy in the system, the chip cares about the peak chip temperature. The chip will always be the hottest thing in the backpack, but the delta in temperature between the chip and the air can be quite large. Indeed, in practice, for "natural convection" (no fan), this dT between the chip and the air is considerable. When you add a fan ("forced convection") you shrink that dT substantially.
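Rough numbers make the point concrete (illustrative values, not measurements of the actual hardware). Newton's law of cooling gives:

    Q = h · A · ΔT   =>   ΔT = Q / (h · A)

    With Q = 3 W dissipated over A = 25 cm^2 = 0.0025 m^2 of effective area:
      natural convection, h ≈ 10 W/(m^2·K):   ΔT = 3 / (10 × 0.0025)  = 120 K
      forced convection,  h ≈ 100 W/(m^2·K):  ΔT = 3 / (100 × 0.0025) = 12 K

An order of magnitude in h buys an order of magnitude in chip-to-air ΔT, independent of what the backpack as a whole is doing.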
Whatever, it is easier to see for me now. Lol
Seeing the benefit of the fan in terms of increased heat transfer to everything the air touches is easy. Full stop.
They fixed it now.
I mean, a wireless USB hub would eliminate exactly one cable [1], and onboard wireless USB requires the same number of radios as WiFi. [2] But "Wireless USB" still sounds like a kinda sexy answer to "What are you working on?" [3]
[1] Wirelessly eliminating one USB cable already had its critical solution in a mature, dongle-dependent wireless mouse market.
[2] For example WiFi printers were already a thing and fit into the evergreen problem of sharing printers and wireless USB wasn’t going to improve online experience.
[3] “Wireless USB” is a great sound bite. Short, sounds like the future, and people will feel like they know what it means. [4]
[4] The article reminded me that at some point in the last five years (or maybe ten, these things run together) I thought "wireless USB would do that" and googled "wireless usb", because surely it must exist, but of course it didn't really, and I probably bought a long cable off eBay. But I remember coming up with the thought and googling.
Btw, is there a direct comparison anywhere regarding energy consumption of the competing standards in real situations?
I don't know how Logi Bolt works, but Logitech has claimed that it should work better than BLE when the 2.4 GHz band is congested. Also that it would have better security than BLE.
Doesn't the same problem exist for USB dongles with proprietary RF protocols?
Logi Bolt is a good solution. But ime most other USB dongles are terrible. I have had a lot of bad connection issues with such USB dongles, and never with similar Bluetooth devices. USB dongles also use the same 2.4 GHz band, and on top of that they are prone to interference from nearby active USB ports [0]. If you have ever had a "jumping" mouse while transferring big amounts of data through a port neighbouring your mouse's USB dongle, this is likely the reason.
I'd think it would also be possible to get around congestion problems by using tricks such as multiple channels and/or interference detection on top of BLE. But only Logitech knows how Bolt actually works.
They mean the mouse communicates an absolute position (relative to some arbitrary 0,0 the mouse decides upon) instead of a relative direction.
The dongle can then take the latest coordinate packet and diff it against the previous one to get a relative movement to pass to the system via HID.
If RF packets are lost, some latency occurs, but the dongle still has the previous mouse coordinate and can make a fairly accurate correction once a packet gets through (it gets from A to D, but might skip points B and C).
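A toy sketch of that dongle-side logic in Python (the names are mine, not any vendor's firmware):

    # The mouse reports absolute coordinates; the dongle emits relative HID
    # deltas, so a lost RF packet only delays motion instead of losing it.
    from typing import Optional, Tuple

    class CoordDiffer:
        def __init__(self) -> None:
            self.last: Optional[Tuple[int, int]] = None

        def on_rf_packet(self, abs_pos: Tuple[int, int]) -> Tuple[int, int]:
            """Turn the latest absolute position into a relative delta.

            If intermediate packets were dropped, the delta spans the gap,
            so the cursor still lands in the right place (A -> D).
            """
            if self.last is None:
                self.last = abs_pos
                return (0, 0)              # first packet: establish origin
            dx = abs_pos[0] - self.last[0]
            dy = abs_pos[1] - self.last[1]
            self.last = abs_pos
            return (dx, dy)                # what would go into the HID report

    # Example: packets B and C are lost; the A -> D delta still sums correctly.
    d = CoordDiffer()
    for pkt in [(0, 0), (13, 7)]:          # (1,1)...(12,6) never arrived
        print(d.on_rf_packet(pkt))         # (0, 0) then (13, 7)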
I am not sure which dongles make these corrections, but my experience with dongles is worse than with Bluetooth. Typically, a mouse is very close to the Bluetooth antenna of a computer, and I have not really experienced any connection issues due to missing packets etc. In contrast, I have had tons of issues with USB dongles due to USB interference.
It could send a "reset 0,0" packet of some form in this case, but now reception of that packet becomes critical to continuing to properly communicate motion to the attached computer.
And those "how I would have designed a wireless mouse protocol" guys are back at the square one.
It's just that they control both sides of the signal so can better optimize the connection.
The fact that Logitech’s current dongles are just BLE with a fancy encryption scheme tends to indicate that they really want their proprietary hardware, and bandwidth is not the reason.
Dongles are also plug and play (no pairing dance) and more readily support multiple devices on the same computer.
Bluetooth has gotten better over the years but it doesn’t provide a meaningfully better alternative for the it-aint-broke consumer mouse market.
It is true though that USB interference for wireless dongles is an annoying reality. My Logitech Unifying dongle has issues whenever I copy files over USB. I'm not sure if later revisions or their Bolt dongles have improved on that.
In theory, Bluetooth ought to be the replacement for most use cases, and would simply require replacing your USB devices with Bluetooth devices. In practice, Bluetooth is still kind of terrible, so I'm tempted to say any alternative timeline where something else won the personal area network war would probably be better.
We still kind of do wireless USB, in that the standard for wireless mice and keyboards is still not Bluetooth but a dedicated USB dongle that ships with the device. Such options are available for wireless headsets as well, although Bluetooth seems to be winning in that niche.
Btw, do you have any other suspected reason (politics aside) that wireless USB did not catch on?
Bluetooth is a nightmare of a standard. Up until very recently, even pairing two devices was a non-deterministic operation. Apple went as far as creating their own chip with their own protocol for their headphones, just to avoid dealing with Bluetooth.
Sure, your Bluetooth headphones only 1:1 connect to your phone... But if they could connect directly to your WiFi router they could keep playing music when your phone goes out of range... Or you could connect them to two phones... Or you could connect them to your TV to get sound from that...
Basically, IP networking still allows direct connections, but also allows far more possibilities.
Same with wireless USB - a wireless USB printer can only print from one host - but a wireless IP printer can be on the network for all to use.
>> attack surface, and give manufacturers access to way more information than I am comfortable with
When your device is on your WiFi you cannot be completely sure what it does (unless you monitor the traffic).
I've got a cat named Emacs, but he's not allowed to be a root password.
No firewalls to worry about, no external access, nothing, just all my devices automatically communicating with all other devices.
If any of my colleagues made an overly abstracted solution for a problem and shipped it with a DSL to configure it, I would say no and ask them to solve the problem at hand.
I agree though that existing WiFi networks are hard to connect to from devices where battery life needs to be measured in months.
It's so terribly slow it's almost unusable, but it does seem to be substantially more power efficient than running a WiFi hotspot all the time.
[1] But certainly not best. Consensus for "best" goes to the open source ExpressLRS work based on the Semtech LoRa products.
https://arstechnica.com/gadgets/2024/02/hdmi-forum-to-amd-no...
Basically you "just" put your laptop on your desk and it automatically starts getting power (similar to what phones can do nowadays) as well transmit video to a display (on the same desk).
It's sad that went nowhere, it would have been very cool and something actually useful.
Less efficient, just for "cool". I think it's better to stick with cables.
It would be a marginal improvement at a huge increase in complexity.
There's also the adjacent-intention problem: the same action leads to a connected computer, e.g. I put my laptop on my friend's table just to set it down, and it connects against my intention.
Connecting over a cable is trivial: you detect the connection and that's it, and the user physically sees the connection between the devices.
Connecting over radio requires pairing, that is very frustrating when it doesn't work. Pairing is annoying so devices try to automatically reconnect, but then if you pair with multiple devices, it brings frustration because it never automatically connects to what the user wants.
Whenever cables are a possible solution, they are superior.
Everything's frustrating when it doesn't work.
> Connecting over radio requires pairing
This is a solved problem, plagued by technology fragmentation. You could very well save the necessary information for discovery and pairing onto an NFC tag and use that to access the network (further authentication might happen, if configured).
This is basically never done with WiFi because you cannot assume a client host has an NFC reader (let alone proper code handling the tag and the information).
But it's done in the world of Bluetooth: some big-name headsets (Sony, IIRC) can do Bluetooth connection negotiation via NFC. You activate the feature (dedicated button), tap your phone, and off you go. No PIN, no pairing annoyances.
My point being that it cannot be frustrating if it doesn't exist.
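Conceptually, the tag just carries everything that discovery and pairing would otherwise have to negotiate over the air. A toy sketch of the idea (the real Bluetooth mechanism stores a binary NDEF record, MIME type application/vnd.bluetooth.ep.oob if memory serves, not JSON; all names here are simplified stand-ins):

    # Toy model of NFC out-of-band (OOB) pairing: the tag holds the device
    # address and a pairing secret, so the phone can skip inquiry/scan and
    # the PIN dialog entirely.
    import json

    TAG_CONTENTS = json.dumps({
        "bd_addr": "AA:BB:CC:DD:EE:FF",  # hypothetical device address
        "name": "WH-1000X",              # shown to the user, no scan needed
        "oob_key": "<secret written at manufacture>",  # placeholder OOB secret
    }).encode()

    def on_tag_tapped(tag_bytes: bytes) -> None:
        info = json.loads(tag_bytes)
        # A real phone would hand these to its Bluetooth stack, e.g.
        # connect(info["bd_addr"], oob=info["oob_key"])
        print("Pairing with", info["name"], "at", info["bd_addr"])

    on_tag_tapped(TAG_CONTENTS)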
> This is a solved problem
Tell that to my AirPods, which connect to the wrong device (so I need to go connect to them manually most of the time), and to my phone, which doesn't automatically connect to my AirPods when it is already connected to my watch.
Granted, I want wireless between my phone, watch and headset. But I wouldn't call it a solved problem.