I remember back when USB-C was becoming a _real_ thing, with the EU fully on board and setting compliance deadlines for every device manufacturer selling there. This eventually forced Apple, albeit a few years earlier than required, to switch from the venerable Lightning connector to USB-C, a standard Apple had in fact helped develop.
Once the connector was being relied on for power, not merely data, a problem became apparent: cables sold online were being overwhelmed with current and failing, sometimes with significant consequences. This wasn’t like the Samsung Galaxy Note 7, where the batteries were at fault, but poorly engineered cables with insufficient wire gauge and insulation.
USB-A 2.0 connectors are pretty simple, with four contacts covering power, ground, and serial data. That design had its limits for both speed and power, because clocking higher only gets you so much over unshielded copper. Adding more contacts is the best way to improve both, and while Lightning could have done this more effectively, USB-C was set to push the envelope of what the most common connector in the world could do.
Do you remember the funny-looking USB connector on some older portable hard drives and phones? Samsung used this on a few Galaxy devices, and I pity the users who had to keep one of these ports clean and its cables undamaged, as it’s the most awkward connector I can think of, more so than the USB-B port you’re most familiar with from printers.
USB-C to USB 3.0 Micro-B cable
As you can see, that’s just adding more wires to the mix in a sidecar, but it was a real thing that had its use for a few years before bigger, better cables came along. The early failures were more likely a manufacturing problem, which is why you don’t hear about USB-C cables being an issue anymore. With the proliferation of the connector, better communication and negotiation of data and power became widespread. A more basic USB-C cable, like an Apple 2 m charging cord, is only USB 3.0 at best, but features extra wires for power. How do I know this? Let me show you.
All of these contacts are for power, except for D+ and D-, which are the basic USB data lines.
This neat board has USB connectors on two sides, with USB-A 2.0 and 3.0 ports and USB-C on the left, and USB-B, -C, Micro-B, Micro, and Lightning on the right. It can also tell if shielding is present, which the Apple USB-C charging cable definitely has. I’ve not read deeply into the manual, but there’s also an ID LED, which may indicate some extra intelligence in a cable.
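For reference when reading the tester’s LED labels, the USB-C pinout can be sketched as a simple mapping. This comes from general USB-C documentation rather than the tester’s manual, so treat the grouping as illustrative:

```python
# USB-C receptacle pinout, A row only (the B row is its mirror image).
# VBUS and GND carry power; D+/D- are the legacy USB 2.0 data pair;
# the TX/RX pairs are the SuperSpeed lanes; CC handles orientation and
# power negotiation; SBU is a sideband channel used by alternate modes.
USB_C_A_ROW = {
    "A1": "GND",  "A2": "SSTXp1",  "A3": "SSTXn1",  "A4": "VBUS",
    "A5": "CC1",  "A6": "D+",      "A7": "D-",      "A8": "SBU1",
    "A9": "VBUS", "A10": "SSRXn2", "A11": "SSRXp2", "A12": "GND",
}

def pins_for(prefix):
    """List the A-row pins whose function starts with `prefix`."""
    return [pin for pin, fn in USB_C_A_ROW.items() if fn.startswith(prefix)]
```

A power-only cable needs little more than the VBUS, GND, and CC pins wired, which is why so few LEDs light up on those.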
This neat little board is pretty cheap, runs from a CR2032, has a power switch, optional DC power in, and can give you a pretty good idea of how good, or bad, your cables are. Want to know if an A-to-C or A-to-micro is power only? Easy!
USB-A to -C power cable is only using the VBUS and GND wires, so it has no data
Power-only -C cables are very common, which is good, but can also be frustrating for those used to the -C connector carrying both power and data. The number of times you’ll grab a short -C cable and plug it into a device, only to have it charge but not mount, will be greater than zero. It’s pretty easy to test, and I think my next tip will also be useful for those who hold a cable and wonder what it might be able to do.
Those lights mean that there are at least four, and maybe up to eight, wires present: actual physical copper inside. If you think back to the Apple cable, you begin to understand how many more wires can be present. Even in that photo there are many, many LEDs not lit, but that’s not to say you don’t own a cable that can light up most, if not all, of them.
Do you have a USB 3.1 Gen 2 NVMe SSD? Does it have one of those short cables that’s really, really stiff? There’s a reason it’s 1) short, and 2) stiff. Higher clocks make errors over distance more likely, so fast cables tend to be short not just for aesthetic reasons; cost, complexity, and reliability all factor in. This cable is from a RAVPower 1TB NVMe SSD and it almost gets the full tree. It’s short, less than 12″, and really resists being bent.
USB-C cable included with a USB 3.1 NVMe SSD
It was tricky to get the cable to bend all the way from one side of the board to the other, but I was able to make it work. It’s shielded, and it uses all of those high-speed RX and TX side channels, but not both of the D+/D- lines or the other CC line.
You can find this little widget on most any online marketplace or go to https://treedix.com and order direct. I bought this one through an eBay seller, and did not go for the optional polycarbonate case.
I have a few m.2 SATA Samsung drives that I needed to securely erase. It’s a tricky thing to do, surprisingly, especially when you’d prefer to not do a long pass in the manner that you would with hard drives.
For those unaware, Samsung has a built-in tool with the Magician software to create a USB boot disk to securely erase an SSD. With a 2.5” drive, where the SATA and power are separate, dealing with a “frozen” drive involves starting the utility, selecting the drive, unplugging the power lead from the drive, powering it again, and then proceeding with the quick, secure erase.
When the drive has an edge connector like an m.2 slot, this isn’t possible. In the course of trying to replicate this process I tried many things, including other computers. None of this effort was successful, however, and I needed a hybrid approach.
It occurred to me that someone must make an adapter that would make it possible to connect an m.2 SATA drive to a standard SATA port. There are reasons for doing this, though I’d have a hard time coming up with many, but I’m glad that such a thing exists. It’s cheap, and worth having around for not only tasks like this, but as a neat tool to keep on the shelf just in case.
A brand called ELUTENG makes this:
It’s SATA III and can adapt to all four m.2 module lengths, from 30 to 80 mm. The drive slides right in, and the included hardware and screwdriver can fix it in place if wanted.
So, I picked one up, and it does the trick just fine, turning an m.2 drive into a 2.5″ analogue and letting me securely erase these drives using the Samsung utility. It’s brutally utilitarian, and that’s exactly what the task needed. It might also be useful in a SATA-only system at some point, and with m.2 SATA slots becoming a little more unusual, that might be a thing in the future.
For those curious, this was using an older NUC, where I was able to disconnect the power from the motherboard using the cable connector below:
For the full details and part number for this item, which I did not receive for free, see below.
I was a vehement believer in the “Internet of Shit” as devices with wireless capabilities were sweeping into the home. It seemed like, and is, a bad idea to just add whatever crap device to your wireless network in the hopes that it will do a trivial thing. This goes from lightbulbs to water shutoff valves, and everything in-between.
That changed a few years ago, and I’ve come around to using _some_ IoT devices in the home. The current extent of this is lights and power switches. Nothing crazy. What I’d like to go over is the what, how, why, and who. The start was simple, then things sped up, then got fun. It went from one vendor to four, but all work within one system, which is the most important part of the whole arrangement. I chose to have one master system that would control everything, and it is not Home Assistant.
I stand by the statement that Apple’s stance on device security, and security in general, is a result of their not being able to do literally anything on the web. This was an important pivot, and it’s why I eventually came to respect Apple for a few reasons. There are issues with the company’s policies, but they hold the line in most cases, and bend to good policy when they’re wrong. It was only when Apple really got their HomeKit system solidly in place that I figured I’d try some home automation and devices.
The biggest reason I went with Home is that the core principle of the system is that it can work without talking outside of the network. No secondary apps should be necessary to add and use the core functionality of a device. A lightbulb, wall switch, camera, or any other device can be added and configured without downloading an app from the manufacturer. This requirement means that fewer Apple Home devices are out there than Alexa or Google ones, but it also means that when those device manufacturers abandon them, or disappear completely, the devices are still useful.
Because of this standalone model, you must have a device that manages everything. Until recently, it was possible to do this with one of three device types: an iPhone cannot be a hub, but an iPad, HomePod, or Apple TV (3rd generation or newer) can be. I had an iPad available, so I used it for the task of figuring out whether or not IoT was something I wanted to keep using.
I started with some WeMo smart plugs. They are cheap and easy to get, and are frequently found used for half their normal retail cost of $30. A firmware update meant that even the early versions could be used with Apple Home, but with some complications that I may get into. These did require some fiddling, and getting the WeMo app, but that wasn’t something I felt uncomfortable doing, as the brand belongs to Belkin, which is known very well in the industry for making good products and being forward thinking.
Once I went through the laborious process of adding the switches to Home, which was complicated because these older WeMo devices didn’t have a QR code on them, I got to making automations that would do simple things, like turning on two lamps in the living room 45 minutes before sunset and turning them off at night. It was something Apple had baked into the Home app, and just a few changes and selections made it work. This was all fun and good, and I eventually added five of these switches, plus one bonus from another brand. They’re still present in the house doing normal things, but interestingly they don’t need to turn on lights anymore.
I don’t remember when I got the first HomePod mini, but I figured that the newish device would be a good replacement for using the iPad, and I wanted to add a speaker to the home for simple functionality. I added it to the kitchen area, then added another to the bedroom, and a third to a basement area. I found it fun to cluster these together and play music throughout the house, but it’s a feature that a person will likely do once and never again…but they can. HomePod being a hub is a good thing, because the devices get OS updates and features get added behind the scenes, all managed in the Home app’s settings.
Lights may have come next, when I looked and found some LIFX bulbs to try. Home compatibility was a must, and they came with little cards to scan to easily add them to Home. I put these in the bedroom area to free up one of the switches, and then realized I could ask Siri to change the brightness of the bulbs. It honestly hadn’t occurred to me previously, because the lights had been controlled by switches, not the bulbs themselves. I changed a few scenes so that when the bedroom lights went on they weren’t at full brightness, but perhaps 60-70%.
A big change came when a local decided they’d had enough of their Hue system. A hub and what ended up being over 20 bulbs were mine for a price I could not turn down. This did introduce another app to the mix, but I was again confident that Philips, one of the largest electronics manufacturers in the world, would do me right. The difference between the LIFX bulbs and the Hue bulbs is that Philips uses a hub, which connects to the network and acts as a bridge between the two systems. The core change is that the bulbs communicate over a different network (Zigbee), not the normal wireless channels, meaning that network congestion isn’t an issue. The person I got the system from found out very quickly that 20+ bulbs on their home wireless was a problem. But it wasn’t mine.
The Hue app is pretty good, and I was able to very quickly and easily set up rooms and get them added to Home. This changed the game throughout the house, from the porch to a closet under the stairs. Sure, I now leave all of these light switches on and there’s more power usage, but the Hue bulbs use a minuscule amount of power per light. These bulbs were only the normal-sized, normal-temperature type, with no colors or other fancy features. On, off, and level. That was all easy to configure and set up first in the Hue app, then fine-tune in the Home app. Once again I would set things up in an app, but could then leave it closed for months at a time.
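A nice side effect of the bridge model is that the Hue bridge exposes a simple local REST API, so on, off, and level can be scripted without any cloud involvement. A minimal sketch of the older v1 API; the bridge address and app key below are placeholders you’d generate from your own bridge, and the helper name is my own:

```python
import json
from urllib.request import Request, urlopen

BRIDGE = "192.168.1.50"   # placeholder: your bridge's LAN address
APP_KEY = "your-app-key"  # placeholder: created via the bridge's link button

def light_command(light_id, on, bri=None):
    """Build the v1 API path and JSON body for an on/off/level change.
    Note `bri` is 1-254 in the Hue API, not a percentage."""
    state = {"on": on}
    if bri is not None:
        state["bri"] = bri
    path = f"/api/{APP_KEY}/lights/{light_id}/state"
    return path, json.dumps(state)

def send(light_id, on, bri=None):
    """PUT the state change to the bridge; it replies with a result list."""
    path, body = light_command(light_id, on, bri)
    req = Request(f"http://{BRIDGE}{path}", data=body.encode(), method="PUT")
    return urlopen(req).read()
```

Setting `bri` to roughly 150 out of 254 lands near the 60% brightness mentioned earlier, which is the kind of tweak that’s easy to script this way.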
Later on I decided to add some first-generation HomePods to the system, simply for use as audio in the living room. They were supremely overbuilt in many ways, and very serviceable if they failed, but also very easy to find used at reasonable prices. Amusingly, the first two I was able to get were white, then black, so I have a salt-and-pepper set. I kept meaning to get another white one, but I’ve kinda settled into the set I have. I was able to easily pair them together for AirPlay from phones, tablets, and computers. It’s something I use many times a week, and sometimes many times a day.
With one of the HomePods I got an iDevices switch with an integrated light, which has been kind of fun to use. It also helped to get its app, as the very early ones like WeMo didn’t have the QR code for setup. I now use it as part of the hifi system for my amplifier, and it does a great job.
While at a local reseller I picked up an LED strip, which was really very fun to set up and use. Two meters of RGB lights in Home is pretty simple, and I know it can do more fun things in the manufacturer’s app, but I really only need its more basic functions. Colors and brightness are the core uses for me, but I did have some more fun when another batch of devices came along later.
A very good friend was moving from a Philips system to other pastures and offered me a very valuable trove of items to spruce up the home. Four RGB bulbs, two of the ambient bulbs, two wall switches, a multi-switch, and another LED strip were shipped to me. I gave it a bit of thought and put these in the most-used places, where I’m at either daily or weekly, and where a different color or tone or hue would be a good thing. The living room once again got its bulbs switched out, and now it can be very interesting in there. Same with the bedroom, and the studio. That’s where the second LED strip went, and it makes a fun accent to the vibe I’ve created.
One of the reasons for using the Hue app outside of the Home experience is that it offers several dynamic scene options for RGB. For example, I have the LED strip in the studio set to emulate a fireplace, so it’s very red, with yellow and orange flickering. I can also have a similar effect with the other RGB or ambient, temperature-changing bulbs. It really did add spice to some rooms.
So, what’s next? I’m not sure. I’ve only just recently mounted those two wall switches. They have on, off, and two dimming buttons. We decided on locations for them, and this has brought down the calls to Siri a bit, but that’s fine. Sometimes a manual switch is just easier and more reliable than a speaker either hearing or comprehending what you’re asking. It seems like it’s come full circle, but sometimes that’s the process.
At this point I’m satisfied with what we have set up. I think that more accent lights, such as the ones from Nanoleaf, would be fun to add in the office space. I’ve avoided things like in-wall switches because this house has very thick copper wiring, and my previous delves into the walls have led to a lot of sore fingers and cursing. Add to that the complete mystery of how the breaker box is zoned, and, well, remotes it is.
There are few things I’d change, but the biggest would be for Home to incorporate some of the more niche features of things like the light strip and bulbs’ RGB color scenes, like the fireplace. I understand why they don’t, because those are vendor-specific, but I’m hoping that one day Home will add such things. We’ll see what happens.
This isn’t going to be a purely tech-focused entry, but the first part of what’s likely to be a series. It’s more of an out-loud guide for myself and possibly for the benefit of others.
At the start of 2024, my spouse and I both had living, still-married parents, something very few people I know could say. The only other was a sibling, or was in this case, as 2024 was a tough year for parents in the family. It’s not that we didn’t see this coming, because it is mortality in the simplest terms, but two of these losses were brutal. Needless to say, in early 2025 that first statement can no longer be made.
For my brother’s family I was able to rescue data from a mobile phone and PC, then transfer multiple Video 8 tapes to a digital video format and share them more broadly. This is something that I enjoy doing, because a tape is a great place for a backup, not the original copy. Those videos and photos and files are shared and accessible to the family, while the digital devices have been sanitized and moved on to new owners.
For my spouse’s family’s needs, it was a little bit easier, as the change and passing wasn’t as fast. We were able to, as a family, make decisions and get the proper settings changed in order to make sure that data and access were not lost. This side of the family uses iOS and macOS exclusively, and those platforms, along with iCloud, have features like Legacy Contacts that enable access to data if immediate access is ever lost. This was easy, and included two of the siblings in the family. We also did something similar with the Google account they used, to ensure that access to the iCloud email address was secure. This is important because so many necessary services like internet, banking, utilities, and other household needs are attached to email addresses and phone numbers.
2FA is a good, or great, thing, but boy can it be tricky when you want to do simple things sometimes. For example, I was trying to enable a feature in iOS, but there was an old iPad mini on the account. This device was running iOS 12, which didn’t support the feature, so it had to be removed from the iCloud account before the feature could be enabled, which is a bit strange, but understandable. However, when I tried to remove this device, I was unable to use the passcode. After several tries I had to add my face to the device, and only then was I successful. Very odd, and possibly controversial, but for other reasons that I won’t go into here.
With all of this set up, we also decided to change the phone passcode. It had been left in the hospital with the person, and family members, but there was an incident where data was removed from the phone. Yes, it was a conversation with another family member, and all data was removed from the conversation. I consider this a violation of trust, as only one side of the conversation is truly private. This prompted us to change the phone’s passcode and not share it, which then caused some controversy. Be prepared for this, but also have just cause.
Along the same lines, we did change the password for the computer in the house, which only had a password to keep younger grandchildren from installing programs on it unattended. I can’t say that this has had any controversy related to it, but suffice it to say that the sole person who took issue with it can probably guess which step was one too far.
The next steps have been slower and more deliberate. We now have the Apple, Google, and internet provider logins and passwords saved on the legacy device and on that of the still-living spouse. As of iOS 18 and macOS 15 there’s a new Passwords app available. It covers the 90th percentile of what any user would need in one app, and we’re putting it to use by making sure that the login data is stored on iCloud and is now shared between accounts and with other family members. One interesting feature I found only last night is that you can set up a passkey in a shared folder on another account. Neat.
I highly recommend using Passwords, even if you’re used to LastPass, 1Password, or other apps. I’ve been very satisfied with the transparency it’s enabled, especially with regard to what was a massive number of wireless networks I’d connected to in the last 15 years. I must have deleted over 100 networks I’d long forgotten about. This is wise because MITM attacks using devices like the WiFi Pineapple prey on a device’s willingness to connect to anything that looks familiar.
The Passwords sharing feature is something we’re just starting to use. It does encourage Family Sharing, where you can share passwords with all members of the shared iCloud storage. We opted not to do this, but to be more specific about which iCloud users and passwords to share with. So far it’s working well, and we’ll be making changes to which passwords are shared and which accounts have access to them. I quite frankly feel much better about using this as a method of password sharing versus a standalone app, because many companies you know of have been in the news for the wrong reasons. Share wisely.
Part two will likely deal with what you do when a family member passes and you want to maintain their accounts and communication, but their spouse isn’t even as technical as they were. It’s going to be interesting.
I’d been given an older Netgear wireless DSL/Cable router years ago. This included the box, which was nice, because I planned to sell it for $dollars$ and it usually presents nicer when you have things like that. I naturally forgot about it until it was unearthed months ago. I listed it and waited, as one does.
Eventually someone will come along and want/need the old, outdated thing you have. Or so my experience has taught me. I recently sold a pair of AirPort Extreme routers for OK money, but have also sold other 802.11n routers in the recent past which no one should use.
The WNR2000 isn’t special. It’s the upstanding type, so maybe that’s unusual, but it’s a black rectangle with lights on it, so pretty standard in that way. This is the v4, which isn’t important unless you find yourself in the situation I did.
When I was alerted that it had sold, for $dollars$, I figured it was a good idea to power it on and check whether I’d reset it. What I was greeted with, after a few moments, was a flashing amber light. This seemed to be a bad thing, and the NETGEAR%% wireless network listed on the back wasn’t showing, nor were the lights on the front that would indicate that wireless or WPS were working. Oh boy.
I seem to recall booting it up and trying to update the firmware at some point, but that memory isn’t to be trusted. Maybe this is why I was given the router? Honestly, that didn’t matter. I’d prefer to sell a useful item rather than alert the person who had paid for it that a refund was in order. Some time could be devoted to this, and I am not uncomfortable in these situations when there is a solution to find.
Lots of searches and a few videos later, I was pretty sure that Netgear had a solution. I found that this was either a power issue or corrupted firmware. I was using the beefy 12V 1.0A Netgear adapter that it came with, so that wasn’t it. So, how do I fix this firmware thing? Well, Netgear didn’t make it easy, and as anyone who has ever done any firmware hacking with routers knows: versions matter.
I found an insightful video detailing how to use an old protocol, TFTP, to transfer the firmware to the router. I hadn’t used it in a very long time, since the days in school of sending files to Cisco routers. I was pleased to see that usable TFTP programs existed, so I could at least skip the command-line syntax minefield. What was frustrating, however, is that Netgear didn’t seem to make this easy. Sure, the firmware was right there, including many older versions back to the one it shipped with. However, the instructions were not clear.
Then I stumbled upon the right search phrase and voila, here we go.
The instructions are pretty clear, except for one step, but otherwise I was pleased at the result. I followed the steps here to send the latest firmware file to the N300 WNR2000v4 and it’s back up and working:
The process is quick, if you make sure not to PUT before you should. I tried that, and then realized the directions wanted me to restart the router first. After following those steps, the PUT went fast and the router restarted. After a few minutes I was greeted with a solid light, and then the wireless and WPS lights. I checked and found that the NETGEAR%% network was available. Going to 192.168.1.1 on the device connected to the router displayed the setup page. Just to be sure, I powered it off, waited, powered it back on, and was relieved that it booted again.
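For anyone curious what the GUI tool is doing under the hood, a TFTP PUT is a tiny protocol. This is an illustrative client sketch, not Netgear’s tool, and the address and filename at the bottom are placeholders:

```python
import socket
import struct

def tftp_put(host, filename, data, port=69, timeout=5.0):
    """Upload `data` as `filename` via TFTP (RFC 1350, octet mode)."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(timeout)
    # WRQ packet: opcode 2, filename, NUL, transfer mode, NUL
    sock.sendto(struct.pack("!H", 2) + filename.encode() + b"\0octet\0",
                (host, port))
    block, last_len = 0, 512
    while True:
        resp, addr = sock.recvfrom(516)       # server replies from a new port
        opcode, acked = struct.unpack("!HH", resp[:4])
        if opcode == 5:                       # ERROR packet
            raise RuntimeError(resp[4:].split(b"\0")[0].decode())
        if opcode != 4 or acked != block:     # not the ACK we're waiting on
            continue
        if block > 0 and last_len < 512:      # final short block was ACKed
            break
        chunk = data[block * 512:(block + 1) * 512]
        block, last_len = block + 1, len(chunk)
        # DATA packet: opcode 3, block number, up to 512 bytes of payload
        sock.sendto(struct.pack("!HH", 3, block) + chunk, addr)
    sock.close()

# Placeholder usage; in recovery mode the router listens on 192.168.1.1:
# tftp_put("192.168.1.1", "firmware.img", open("firmware.img", "rb").read())
```

A transfer ends when a data block shorter than 512 bytes is acknowledged, which is why the GUI tools can tell you the upload finished almost instantly.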
Now it’s in the mail, headed to its new owner, thoroughly reset. I have a little more knowledge of how to do a recovery of this nature, and a new tool to try and remember.
This one is work related, and involves converting analog media to digital at scale. It’s been going on for many years, and that usually means that things have to change.
We initially used an Elgato Video Converter for this task, with macOS, and it worked well for years. However, this was the only use of the iMac, so we shifted the unit to a Windows 10 computer. It worked exactly the same and was reliable.
Windows 11 introduced some enhanced security, including what Microsoft calls Core Isolation with Memory Integrity. This is a good thing, until the software you’re using wants access to that memory area and gets really upset when it doesn’t work. Elgato hadn’t updated the Windows software, which is available on their website, in many years. They have updated it for macOS, but that’s not a path we’re going down.
Core to the Problem
So, figuring out what to replace the Elgato converter with took a few quick searches, and two items were purchased. One half of this arrangement takes the composite video and stereo audio from the analog source and converts it to HDMI. These units frequently bill themselves as upscalers, taking a 480p signal and bumping it to 720p or 1080p. This is fine, but the video output we want is 640×480.
https://a.co/d/9tRv0uW
The second part is another device that takes in an HDMI signal and splits it into an audio and video stream. It’s very simple, and it works great in theory. There were some hiccups, however, and that’s where it gets interesting.
https://a.co/d/0G3wg0v
We needed to use different software for the video conversion, moving from the Elgato software to OBS. It’s well suited to the task, and with some simple instructions it was ready for others to use. Pick a video and audio source, change the resolution to 640×480, and start recording. Simple. It worked. Until someone compared video done with the new setup to the old one and raised a red flag.
Audio was the culprit here. No issues with the video, but there was something very off about the audio. I initially attributed this to reverb in the large speaking room on the tape in question, but when I heard the original transfer I got quiet. It sounded far better. This wasn’t just a small difference.
Cheap for a Reason
Interleaving is a way to do two things at once. Its most familiar form is visual: interlacing on older televisions, which would draw one line on the screen, skip the next, and then reverse the pattern on the following pass. That carried all the way to 1080i in the days before LCD TVs were really good. Audio, however, is a medium where interleaving may have been used for years, but only inside the processing chain. Not audibly.
Research into the HDMI-to-USB converter alerted me to some odd settings in the USB Audio Device shown in Windows. It was displayed as having one channel at 96 kHz. This was bad for two reasons: mono isn’t what I expected, and the sample rate was double what it should be. Others had noticed this, and it seems that the audio chip on these devices interleaves the left and right samples into a single stream, meaning there’s a stereo 48 kHz signal in there, but you have to process it to get it out. The device had no drivers, nor were any seemingly available.
https://a.co/d/7RHZTfC
I did find that some owners of components using this audio chip had written code that could untangle the two channels, and in testing I found that it worked. However, due to the technical nature of the process, I decided to pass on it as a solution. More research turned up a device from NZXT called the Signal HD60 that not only had a specifications page on its website, but stated its stereo output at 48 kHz.
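The trick behind that kind of untangling is worth spelling out: if the “mono 96 kHz” stream really alternates left and right samples, the raw bytes are already a valid interleaved 48 kHz stereo stream, and only the header needs rewriting. A sketch of the idea, assuming 16-bit PCM WAV files and sample-by-sample alternation (the real captures may interleave differently, and the function name and paths are my own):

```python
import wave

def untangle(src_path, dst_path):
    """Relabel a '1 channel @ 96 kHz' capture as 2 channels @ 48 kHz.
    If consecutive mono samples really alternate L, R, the byte stream is
    already a valid interleaved stereo stream, so no resampling or sample
    shuffling is required, just a corrected header."""
    with wave.open(src_path, "rb") as src:
        assert src.getnchannels() == 1 and src.getsampwidth() == 2
        rate = src.getframerate()              # nominally 96000
        raw = src.readframes(src.getnframes())
    with wave.open(dst_path, "wb") as dst:
        dst.setnchannels(2)
        dst.setsampwidth(2)
        dst.setframerate(rate // 2)            # 96000 -> 48000
        dst.writeframes(raw)                   # same bytes, new framing

# untangle("capture.wav", "fixed.wav")  # placeholder paths
```

It works, but asking everyone doing transfers to run a script on every recording is exactly the kind of friction I wanted to avoid.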
A Fix
It was an easy install, and I tested it briefly and it sounded fine. To me. I asked that it be tested and around a week later I heard back that it sounded exactly the same. Dubious, I asked to be shown the new and old examples. They were right, and I was scratching my chin again.
Remember that there are two devices doing the conversion, one from RCA to HDMI, and the other from HDMI to USB. I’d only replaced one of these, and either or both could be the offending device. I did some quick looking through comments on the Amazon listing for the RCA device and audio was mentioned many times, negatively. Super.
So, now what do I do? Test to see if the NZXT device is truly passing DVD-quality stereo audio through? Test then again the RCA device to make sure that it’s at fault? Replace the RCA unit with something else? Yes?
Testing
I started by checking the old HDMI/USB adapter using an iPad with an HDMI output dongle. Feeding this into OBS resulted in no stereo in the recording while playing a stereo-panning video on YouTube. I tinkered with a few other things, but the audio echo and general sound were very familiar. Very bad.
That resolved, it was off to test the NZXT Signal HD60 device in the exact same way. A similar video, but longer and with more range, performed as I’d hoped. Stereo separation and some of that neat psychoacoustic stuff to go with it. I expected this, so it was time to move on to the RCA/HDMI adapter.
Here’s where I had a short mental block, which lifted when I realized that I didn’t need video to test, only audio. An older iPad, one with a 3.5mm headphone jack, and a cable with stereo RCA ends was all I needed to confirm what I suspected: that the RCA/HDMI device had the same problem. It did not.
To my surprise, the RCA/HDMI adapter performed admirably. I wasn’t disappointed, because it meant that the problem was with the software. I asked for the original source material and did some live, monitored testing. I heard what I hoped to, and confirmed with the person responsible for the project that it was good. I then had them sign in to the computer and we repeated the testing on their OBS profile.
Ah Ha!
Some confusion set in when we monitored the audio and it sounded good, but when the recording was played back it had an echo. A quick search led me to a video explaining that it’s best to mute other input sources when recording or streaming from a single device.
Oh! Yeah. The thing I did on my profile but that we didn’t include in the instructions for others…oops.
I did some more testing and found that it was indeed the Mic/Aux and Desktop Audio that were contributing to the audio and creating the echo. Muting those is very easy and a thing that I should have included in my instructions from the start.
A Relief
I was pleased that no additional hardware was necessary. Software can be fixed and reconfigured, and guides updated.
So, as a lingering lesson, this is something to keep in mind when a solution to a problem seems a bit too cheap compared to the brands with names you’ve heard of.
As mentioned in the previous post about WiGLE and the AtomGPS device, I used phones as my only means of gathering data, with the tiny exception, long ago, of MiniStumbler. I have, since that post, become more enamored with the AtomGPS itself, especially with the code improvements and community around it. I did want more, though, especially other frequencies, so how would I be able to do that?
Another phone, of course! No, really, I had a blind spot for the newer WiFi 6E spectrum, and in discussion this year with El Kentaro it seemed like the Pixel 6 series was a step up from what I was using in several ways. Not only was it a faster device, with custom Google Tensor silicon, tons of RAM, and direct support, but it had the radios I needed to look into that third spectrum. So, the search began, then stopped, then started, and stopped, and I finally picked one up.
Is it great? Mostly. Is it seeing more networks? Yes. Would I recommend it as a single device? Definitely. Is it the subject of this post?
No.
While having a “main” phone device is a good thing, because it’s self-contained and has many functions apart from WiGLE, there were still other projects out there that piqued my interest. This is a big community, and its members have many ideas, some of which are approachable (AtomGPS) while others are insane and awesome (Lozaning, Mr. Bill, BusySignal, others). I wanted something that could do what a typical phone could do but be dedicated solely to the task. This meant GPS, Bluetooth, 2.4 and 5GHz. The big one that I am always willing to pass on is GSM because, well, who’s counting those?!
Sweet Spot
I’d seen hints of several neat standalone devices, but one kept surfacing as the most interesting option. The JHewitt devices, as they’re known in the community, are compact units based on a PCB designed by the owner of wardriver.uk. This PCB allows for the use of two ESP32 boards, a GPS module, temperature sensor, GSM module, a microSD card interface, and an LCD. This is all housed in a 3D-printed enclosure with two or three protruding antennae.
Image is of the rev3 plus the RTLBW16 5GHz module – photograph by Luke Switzer
This unit can then be built around the PCB, using the BOM, and used for wardriving as-is. The larger antennae, at least compared to a typical modern smartphone, should be able to hear those fainter transmissions, but in the basic setup they only listen in on 2.4GHz, BT, and GSM. This was a drawback to me because all modern wireless systems use at least two frequencies, and I wanted more. When I got word that a 5GHz mod with the BW16 was available my ears perked up and I was now very interested.
One of the primary code developers for the AtomGPS Wigler build, Luke Switzer, had shown off some of his builds in the past, but also mentioned that he had some components ready to assemble for eager users. This piqued my interest, and after a few inquiries a build was agreed upon. It turned out that this was the first of this configuration for Luke, and I’m happy to say it was featured on his Twitter account during the assembly and testing. I was very eager to get this custom build and join the crew of those who rock a fully capable wardriving rig that’s self-contained.
I do mean it when I say that this is a standalone unit because it features some incredible software and hardware. The most loved feature of this build is that it has two 18650 batteries powering it, with charging built in, and it’s fully protected from overdraw. I really like the AtomGPS, but powering it from an external battery is a hassle. This community-sourced solution not only means that the 3D-printed case fully supports the battery module, but it’s one cohesive unit that has the feel of a small FRS or HAM radio. This is a good thing, because it’s meatier in the hand while being very balanced.
A side-benefit? The diymore V8 has an external USB-A 5V port for powering other devices. I think that I’m going to use this for powering my AtomGPS once I figure out how to keep it all in one package.
On the software side it’s even more interesting and impressive. The initial setup is all through a web interface: the new user connects to an onboard, ad-hoc AP and web server. It prompts for a known network to connect to for NTP sync and software updates, and a fallback network configuration for use after the initial setup. This works very well, and the LCD is very good at informing the user about what the unit is doing. This includes the IP address, whether it’s connecting to the server, and other information prior to the typical status display.
Once set up, there are several features that make this almost as good as a phone. When it’s connected to that configured network, prior to fully booting, there’s a 60-second timer before the web server shuts down and scanning begins. This timer resets if a user connects to the displayed IP and makes changes. Neat features include adding your WiGLE API key for manual uploads, and there’s even an option to upload automatically! If you’d rather download the files individually and upload them separately, that’s also pretty easy to do.
The hardware is brilliant, with several case front options. I chose the more temperature-friendly one with the hexagonal design, favoring airflow over style or weatherproofing.
Using the JHewitt rev3 (mod)
Startup from off is a single tap of the right-side button. Booting shows the software version, connecting to NTP, the wardriver.uk server if enabled, and the option to press the power button during boot to reset to initial settings. Following that, and if a known network is found, any pending uploads will take place automatically if configured. Then starts the 60-second countdown for the web configuration, displaying said timer and local IP address. Make sure that you configure the device on a network with peer visibility…ask me how I know.
Finally you will be greeted with the rev3’s dashboard, displaying seen networks, Bluetooth, GPS status, date, time, and temperature. For those who can read it unassisted, you’ll find that there are no stats akin to the Android WiGLE client, where it’s doing database comparisons. We’re just collecting networks, not doing any data analysis here, so you won’t see any statistics for your run…unless you want to fork the code and take that on yourself!
This fully-stocked unit is easy to handle for those with medium to large hands. The grip, as mentioned previously, is like a large FRS or HAM radio. It’s well contoured and light enough that walking with it in hand is easy. I have done this on several walks, antennas horizontal, and not found any fatigue. I did also add a small Scosche magnetic plate to the left side for car mounting. It’s mostly been used in a vent mount and hasn’t overwhelmed the mount or the vent.
Battery life is good, considering you have two big ESP32s and the BW16 onboard with a display and GPS. With what I expect to be a full battery charge I have exhausted the battery over the course of a long day. On my unit there is no way to see or measure the status of the batteries, but I have become more willing to power it down when I’m not moving or in a new area. This increases the number of files, but it is a good idea if you’re away from charging capability.
On that note, this diymore battery pack would be even cooler if it supported hot-swapping of the 18650 batteries, but its protection circuitry doesn’t allow for that. If you pull a battery out and place it back in the holder it will almost certainly not then be drawing current. To be fair, I’ve not tested pulling out one, putting it back in, then taking out the other. However, I’ve read the documentation and user comments about the required power cycle when charging. If you plug the pack in and the unit is on, it will charge and run as expected, but pulling the power will force a power cycle. This could be used strategically with an external battery pack to extend the run, of course, but remember that unplugging it or turning off the pack will restart it.
Performance
So, how well does it perform?
I’ve been using it with the AtomGPS and the two Android phones for a week now. I’m in the habit of uploading the Moto first, then the Pixel, then the rev3. I’ll add the AtomGPS less frequently simply because of the work involved. So far I am impressed!
If you are looking for a BT sponge this is not your device. The Pixel 6 and most phones will do a much better job at finding those than an ESP32 or BW16. As for the other numbers, well, they’re impressive. I’ve been seeing very solid network observations when analyzing the WiGLE upload statistics. The rev3 always has significant new networks to add even after the Moto and Pixel have submitted theirs. The higher-sensitivity antennas give it a huge advantage, making each trip a little more effective. This is what I wanted, and it’s really boosting my upload totals.
This is a sample of a recent run, with the Moto on the top line, Pixel in the middle, and JHewitt rev3 on the bottom. Those are some impressive numbers!
One of the neat things external antennae allow for is bigger, or different, designs. Do you have an old AC1300 Nighthawk in a drawer? Grab some of those cool blade antennas and swap them in. Have an old Yagi in a closet, or maybe a magnetic omni? Let’s go for a road trip!
Conclusion
If you are new to WiGLE this isn’t the place to start, IMO. The numbers-go-up pleasure from the Android app is quite alluring, even years later. Driving around a new neighborhood and visually seeing a change is nice, and you get a better understanding of what’s out there. It will drive you to get more, or better, devices to do one of two things: more input or consistency. Some phones are just bad, while others are mysteriously over-performers. However, if you don’t have a phone with you, you’re breaking rule #1: Always Be Wardriving. Having the AtomGPS in a vehicle or backpack is easy. Same with a phone.
One of the most puzzling things about the Apple Silicon MacBook Air in 2024 is that it can only natively support one external display. This even extended to the early 13″ MacBook Pro, which was just an Air with a Touch Bar. I’ve supported many of these devices since their introduction, M1 and M2 chips, 13″ and 15″ sizes. As of this post, there’s only one way to get full support for multiple displays.
Synaptics has a technology called DisplayLink, which they license to many companies, including StarTech, Hyperdrive, Plugable, and more. Download the drivers from their website and it tricks the OS into sharing a virtual display with the adapter, letting a user have multiple displays. It works quite well, but has its limitations. One of these popped up recently when upgrading from macOS Ventura (13) to Sonoma (14) and using the Zoom accessibility feature. It’s something that I use often in order to make text easier to read, and is pretty important, so when the display connected to my StarTech DisplayLink adapter stopped zooming I was a little puzzled. But I wasn’t surprised, because Apple has been tinkering and changing things for years and breaking capabilities like this.
I found a solution by accident, however, because it is possible not only to support Zoom on both displays but, unlike in Ventura, each display seems to magnify independently. In Ventura, the combined display width and height were treated as a single display: zooming in on one display would also zoom the other, which wasn’t ideal, but worked.
In switching from a Hyperdrive dock to an older, first-generation 13-port OWC Thunderbolt 3 dock, I was unsure which display was connected to which cable. This dock is very odd in that its only display outputs are Thunderbolt 3/USB-C DisplayPort and a miniDP plug. No HDMI or full-size DP plugs. I was using a miniDP-to-DP cable on Dell displays, but found out that I needed to move it. In this confusion I found that the DisplayLink adapter had grabbed the menu bar, making it the primary display. This isn’t the monitor that I would normally use as primary, so I was going to change it, but found myself using the zoom gesture to make the System Preferences panel larger and had to pause and think about what had just happened.
After the Sonoma update, with this adapter set as the secondary display, it no longer zoomed. Instead, it glitched a little bit and stuttered. That it was working now was unexpected, so I checked to see if somehow the issue had been resolved by setting things back up as I had them previously: primary from the dock, secondary from the StarTech adapter. The issue returned. So, I switched it back and the zoom worked as expected and hoped. I thought about using my second display as the primary monitor, and even set it up like that for a few minutes, but then reconsidered the whole situation.
Why was the DisplayLink adapter working when it was primary? I don’t know, but there must be some kind of priority and feature enabled when it is. Thinking for a moment, I decided that switching the two display inputs couldn’t hurt, so that’s what I did. It worked exactly as I’d hoped, and is very much a solution to the problem I’d had since upgrading to Sonoma.
As mentioned earlier, I can now zoom in on either monitor independently of the other. In this window I’m zoomed in, while the other is zoomed out to the full, native resolution. There’s no choppiness or lag, which I did admittedly remember happening on the DisplayLink adapter under Ventura. This behavior is why I wouldn’t have considered the StarTech USB adapter fit for the primary display. Now, however, it works easily as well as the display connected to the dock.
I had emailed Synaptics earlier to ask if this was a known issue and received a response that it was. The software as of this writing is 1.10.0 Build 117, released in October 2023, and no beta is available. I have emailed that same contact to let them know there is a possible resolution for some users, and I hope they will acknowledge it, or that it’s already a known fix.
I’ve been part of the WiGLE Project, as a contributor, since 2017. I have a much longer history with the idea and practice of “wardriving” that extends to the early days of wireless internet as a thing. Wardriving was something we would do with a PCMCIA wireless card, an antenna, and a laptop. Or in my case, a portable PDA running Windows Mobile. Most of the data contributed to the greater project is done so with Android phones, which makes sense because they have good antennas, GPS, and respectable processors. Things are getting weird though, and some devices that are neither “computer” nor “phone” are becoming more popular in the fringe. Let’s talk about my step into this new world.
More creative members of the community have latched on to the small, cheap, and powerful ESP32 devices that are increasingly competent and accessible. They’re easy to code for and program, a favorite in DIY projects, and are turning up in some compelling packages. One such project is the M5Stack AtomGPS device. It’s a combination of the M5Stack Atom Lite ESP32 unit plopped into a saddle that adds SD storage and GPS, the latter being critical for the WiGLE project.
These are affordable, when they’re available, and cost around $30 USD each. This includes the Atom unit, the GPS saddle, and a USB-C to USB-A cable. Add a 4GB-32GB SD/TF card and you’re good to go for hardware. I obtained mine from Mouser, but other sources include DigiKey. Software, well, that’s a different story. Buckle up!
I do not regularly use ESP32 devices, nor Arduino, but I am loosely familiar with the software IDE. I chose to use Windows for my firmware flashing platform, just as a matter of availability, so here’s where it got interesting.
Note that I am an ESP32 amateur. This guide is merely a verbose interpretation of the README.md. Refer first to that guide, which is updated frequently, and follow this one if you’re looking for another option.
Installing Git is easy, especially if you choose the Portable installer. Using the instructions here, or the code above, you can clone the repo. This will place a folder called “AtomGPS_wigler” inside of your Git Portable directory, which is probably in Downloads if you’re using Windows. Why did I prefer the Portable version? It’s quick and easy to use and doesn’t integrate functionality with your computer, making it perfect for a small project like this. The Arduino IDE is also a pretty straightforward install, and it’s an easy interface to use. Once that’s installed there are two very important things to add for the M5Stack Atom specifically: the board and the libraries.
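If you want the clone step spelled out, here’s a minimal sketch. The repository URL is an assumption based on the folder name and developer mentioned in this post, so verify it against the project’s README before using it.

```shell
# Clone the AtomGPS_wigler repo into the current directory.
# NOTE: the GitHub path below is an assumption; check the README.
git clone https://github.com/lukeswitz/AtomGPS_wigler.git
cd AtomGPS_wigler
```

Run this from inside your Git Portable directory so the clone lands where the rest of the guide expects it.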
Add the M5Atom libraries, and all of the dependencies, under the Library Manager tab on the left side of the Arduino IDE:
Copy the URL of the boards JSON below, then go back to the Arduino IDE and open File > Preferences. Add this Espressif board library in the “Additional boards manager URLs” area: click the icon to the right, confirm the URL, and click OK.
Select the Boards Manager tab on the left, search for “esp32” and install the Espressif Systems package.
Now make sure that you have an SD/TF card formatted FAT32 with an MBR partition table; 4GB-32GB cards are tested/supported. The slot is on the back side of the GPS saddle and has that little click-in/out mechanism. Plug the AtomGPS into your computer and get ready to flash.
Open the repo clone file called “AtomGPS_Wigler.ino” and make sure that the board type is set to M5Stack-ATOM or M5Stack-Core-ESP32. I have both of these here because of mixed success. One should work as expected, while the other may fail.
Restart the Arduino IDE.
Click Upload in the IDE toolbar and wait for the magic to happen…or errors. If things go well you’ll see a white cascade of the flashing process, after which the device will be rebooted. If all goes well, you should see a purple, then green LED flashing on the unit. If it’s red, well, that’s a problem. In my case this meant an older problematic version, which I will cover a little bit later. Future versions fixed this issue and the purple/green is what you’ll be greeted by.
Okay, yes, that’s all it does. It powers on, it gets a GPS fix, and it scans. Lines are being written to the SD card in a unique file per boot. These are compatible with WiGLE and can be uploaded at your convenience or the next time you see that hot shot delivery van driver you gave $20 to put it in the glovebox…
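For the curious, here’s roughly what those lines look like. This is a made-up row following the common WigleWifi CSV export convention; the exact field order in the AtomGPS files is an assumption, so check a real file from your own card.

```shell
# Hypothetical WiGLE-style CSV row; assumed field order is:
# MAC,SSID,AuthMode,FirstSeen,Channel,RSSI,Lat,Lon,Altitude,Accuracy,Type
row='AA:BB:CC:DD:EE:FF,ExampleNet,[WPA2-PSK-CCMP][ESS],2024-01-01 12:00:00,6,-67,45.000000,-93.000000,250.0,4.0,WIFI'

# Pull out the SSID (field 2) and signal strength (field 6):
ssid=$(printf '%s' "$row" | cut -d, -f2)
rssi=$(printf '%s' "$row" | cut -d, -f6)
echo "$ssid at ${rssi}dBm"   # → ExampleNet at -67dBm
```

Nothing fancy, but it shows why these files upload cleanly: they’re plain comma-separated text that any tool can slice apart.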
Now the story begins, as does the other way to flash the AtomGPS:
All of this, and my first try at a pair of new AtomGPS units failed. I spent more time than I care to admit troubleshooting why, until a fellow user pointed out that the code developer had noted some reports of SD card detection issues in the older 1.3.2 version. This is the one I was trying to use, and instead of making changes to the code to fix SPI, I decided to use esptool, the alternative way to directly flash a .bin file.
There is an FTDI driver, referenced from the M5Stack GitHub, that I can recommend downloading and installing from here. I have found it not to be necessary, but it’s a simple install.
To use esptool I simply opened the Windows Store and installed the latest version of Python available at the time (3.xx). This done, I opened a command prompt and installed esptool with the “pip install esptool” command, which took very little time. In some environments it would be possible to run “esptool.py” if environment paths were set up properly, but instead I used “python -m esptool” with some arguments:
Note: versions mentioned below have been superseded by 1.4 and above
Make sure that when you’re copying this code you replace [PORT] with something like COM3, and do not include the brackets in your script. Ask me how I know that it doesn’t work with brackets…
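Put together, the flash invocation looks something like the sketch below. The COM port, baud rate, flash offset, and .bin filename are all placeholders for illustration, not the project’s canonical values; use whatever the README specifies for your firmware version.

```shell
# Direct-flash a full firmware image with esptool.
# COM3, 115200, the 0x0 offset, and the filename are assumptions;
# substitute your own port and the .bin from the repo.
python -m esptool --chip esp32 --port COM3 --baud 115200 write_flash 0x0 AtomGPS_Wigler.bin
```

If the write completes without errors, unplug and replug the unit and watch for the LED colors described below.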
This directly flashes a .bin file, which needs no libraries or dependencies as it’s a full image, and if it writes successfully then the firmware is good. That done, my first of two AtomGPS units was working with a 16GB SD card. I fully expected the second to work just the same, but it was not so easy. Using a 4GB card, I was unable to get the second one to work with esptool and version 1.3.2. Frustrated, I flashed 1.3.1, which was in the Git clone folder. No luck, but in some frustration I swapped in a 32GB SD card and it booted up immediately. This was a surprise.
So, what would you do next? Flash 1.3.2 of course. Did it work? No!
So, after a flash back to 1.3.1 and a few minutes of running, I checked the SD card for the AtomGPS csv files which were now present. It’s not a matter of how well it works, at this point, but that it does. The developers and community will add features and improve some of the code, but there’s not a really compelling reason to make sure that the devices are up-to-date. So, in the interest of not spending any more time on these in the immediate future I’ll be content with two of the AtomGPS units running different versions of the software.
Lessons learned? Plenty. I should have checked with the developer to see if there were any known issues. This would have saved me a lot of time, but it wasn’t all in vain. Having more SD cards available is certainly a plus. Knowing that the developer provides older .bin files in the Git clone was a handy thing because I could fall back on older code to avoid new features and potential issues.
Would I recommend an AtomGPS or other dedicated WiGLE device? Yes. These are super handy to put in a vehicle or somewhere that either has a lot of traffic or itself travels a lot. It’s not necessarily a device that needs a file dump and upload daily, or even weekly. I’ll be putting one in each car and uploading the files when I think it might be good. I’ll also keep using a phone too, because you can never have too many antennas…
Thank you to Luke Switzer for the software development, Lozaning for inspiring it, and pejacoby for showing off the project on social media.
Lenovo Thinkpad Universal USB-C dock firmware issue with AMD-based E-series laptops.
Issue arises when users or admins upgrade the dock to the .91 version using Vantage, Dock manager, or manually with a downloaded executable.
I can attest to seeing a working dock have the following issue after the upgrade from a previous version.
The primary issue with the 3.0.91 firmware is that any displays attached to the USB-C dock will fail to work with these AMD systems. We saw this happen in several cases, across multiple generations of the E15.
Running the updater will give errors, mostly that it cannot read the current version of the various hardware revisions, and it will eventually fail.
Other laptops will work with the dock, both Windows and macOS alike. The dock is usable for USB, ethernet, and audio, but video will be non-responsive and not shown as available.
Resolution:
Firmware downgrade!
Using this recent Reddit thread I was able to get the 3.0.80 firmware from Lenovo.
Using a Dell laptop with USB-C, I was able to successfully downgrade to the previous 3.0.80 firmware. All functions return and the dock is completely usable.
I’ve done this without issue on 4 of the 40AY USB-C docks and all devices which were not previously working are now operating properly.
Update – 01/08/24
Lenovo has since released the 3.0.92 firmware update for these docks and I can confirm that it avoids this issue. One dock that had been downgraded was successfully upgraded from .80 to .92 with all functionality working. Another was a recent .91 install that I was unaware of; it got a successful downgrade and then an upgrade to .92.