I’m lucky enough to be part of an extended family that has property in SW Wyoming. It’s fairly close to a very famous city that millionaires love to buy homes in, driving prices up. It’s a lovely place built from actual Oregon timber, with enough space for a lot of people to stay.

Surprisingly to me, even when you can walk out the door and be in a national forest, people prefer to be on their phones (the DSL there works) or watch movies instead of enjoying the night sky. Because of this, most of the family has sent their DVD and Blu-ray collections up to live there. The problem is that DVDs suck: they’re usually scratched, they’ll only get worse, and browsing them in a sleeve is tedious.

I decided to make it my task over the last year to rip all of the media I could get my hands on, whether owned by my family or the extended one. Add to that all of the other content I’ve come across over the years, and now we’re talking. It’s been an interesting, slow process, but it’s also let me figure out what software like Kodi can do if you give it something to work with. I had never used any scraping tools before, and now I understand why they exist and how best to use them.

I started the hardware side of this modestly, with a spare RPi Zero W and a copy of LibreELEC installed from NOOBS. This seemed like a good platform to start with because Kodi on a PZW doesn’t need to do much besides make its shares available. It’s not streaming video out, just serving data over wireless to other devices. Some tinkering led me to find that there are editable files like samba.conf that can make this sharing more deliberate. For example, by default Kodi shares all of its directories over SMB. That isn’t useful to a septuagenarian browsing directories in VLC on an iPhone. Being able to explicitly share only the folders you want is very nice. It’s also handy to make everything read-only because, well, duh.

The PZW was working fine, but it can only power one external drive at a time. I got a neat 4-port HAT for it that expanded its capability, but without a second power adapter, running more than a single laptop drive wasn’t possible. I was also restricted by the drive sizes I had, the largest of which was 750GB. As my content grew past that, I started to think that something more streamlined was a better idea, even though the tinkerer in me loved the PZW solution with Kodi.

An older QNAP TS-119 with a 4TB WD Red disk came up for sale locally and I pounced on it. QNAP still supports these devices with security updates nearly a decade later, and it’s an interface I’m familiar with. Some tweaks here, some very patient copying of 1.5TB there, and now I have an all-in-one solution to the storage end of the equation. It’s only a single-core device from 10 years ago, but it works OK. It spins down the disk when it’s not in use, so it can be relatively power efficient. It has a power button, unlike a Pi, so turning it on and off isn’t a big deal.

The front end is where a lot of the time was spent. Scrapers that use TVDB and TMDB rely on properly named files, usually including the year, with most things in the proper syntax. If you’ve ever happened upon a TV series from an online source, you know how episodes are serialized in a Season/Episode format. This has been standardized, but it does take a bit of time to find and rename episodes of TV shows, and to copy the full names of movies straight from TheMovieDB. If you do that, and tell Kodi that your video folder contains Movies, it will go look up each title and download the appropriate metadata: a poster, plot, actors, and such. It’s neat, but that’s where the next part comes in.

The process of using a scraper works pretty well, but it falls apart when you are dealing with a library that needs to be rebuilt or cleaned. This isn’t a tedious task on a modern PC, but on an older iMac it can take a while. Now think about doing it on a quad-core Pi 3 and it’s a process that takes hours.

Best practice is to tinker on a faster computer with a fast connection and get everything set up. After you’ve figured out where things are going to live, it’s time to add it all to a slower front end like Kodi/LibreELEC on a Pi or an older, slower computer. Telling these to update the library on startup is a good idea because you’re probably not adding hundreds of new titles or files at once; if you are, that’s when it needs time to process.

Make a directory each for Movies, Music Videos, Music, and TV Shows. These can all have subdirectories, e.g. DVD, Documentaries, Comedy Specials, VHS, and the like. Kodi will parse them all and fetch the appropriate data for everything it can find in each folder. TV shows go in their own Show/Season folders with proper Season/Episode nomenclature, as in the sketch below.
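Here’s a rough Python sketch of what that naming looks like in practice. The paths, titles, and helper functions are all made up for illustration; the only things the scrapers actually care about are the “Title (Year)” pattern for movies and the SxxEyy pattern for TV episodes.

```python
#!/usr/bin/env python3
"""Rough sketch of renaming files into scraper-friendly names.

Assumes movies live in Movies/ and should end up as "Title (Year).ext",
and TV episodes live in "TV Shows/Show Name/Season 01/" as
"Show Name - S01E02.ext". Everything below is a made-up example; adjust
to your own library before doing any real renaming.
"""
from pathlib import Path


def movie_name(title: str, year: int, ext: str) -> str:
    # Kodi's movie scrapers match on the "Title (Year)" pattern
    return f"{title} ({year}){ext}"


def episode_name(show: str, season: int, episode: int, ext: str) -> str:
    # TV scrapers key off the SxxEyy pattern in the filename
    return f"{show} - S{season:02d}E{episode:02d}{ext}"


def rename(src: Path, new_name: str, dry_run: bool = True) -> None:
    dest = src.with_name(new_name)
    print(f"{src.name} -> {dest.name}")
    if not dry_run:
        src.rename(dest)


if __name__ == "__main__":
    # Dry run only: prints what would happen, touches nothing
    rename(Path("Movies/old_rip_1999.mkv"), movie_name("The Matrix", 1999, ".mkv"))
    rename(Path("TV Shows/Firefly/Season 01/ep3.mkv"), episode_name("Firefly", 1, 3, ".mkv"))
```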

It’s really neat when it works. Having a niece or nephew download VLC for iOS, open it up, and four taps later they’re knee-deep in the movies they grew up watching at home is pretty rewarding. Whether they’ll use it much is almost beside the point of making it a service in the first place and figuring it out. Right?

I was trying to use VLC 3.2.2 for iOS to get to a hidden folder on a local network storage device. Some time after 3.2.0, SMBv1 stopped working right, and the VideoLAN folks are working on it, but it got me thinking.

The NAS I use has a GUI web interface, and is Linux-based, so it should have some logging, simple or otherwise. Naturally it was disabled by default, because of course it was, and when I enabled it to see what it offered, I was a little surprised.

One of the various tabs showed connection attempts, which is exactly what I wanted to see. I was a bit shocked, then amazed, then puzzled, when every few seconds another entry came up. These entries were from external IP addresses: dictionary login names trying to authenticate via SSH to my NAS. Hmm. What the hell.

First, I logged in to my router and looked at the ports I’d specifically forwarded. Many of these were for services I wasn’t using, mostly leftover attempts to get OpenVPN to work properly. I shrugged and deleted them all, knowing that I could add them back later if needed. That didn’t slow things down.

Next up, I looked at the SSH settings on the NAS and decided that I wouldn’t disable the service, but rather change the port. If I need to get a shell on the NAS in the future, I can always look up the new port number and connect. Bam, that was like closing the door hard.

Relieved, but still feeling like I should keep watching the log, I did indeed see some attempted FTP connections. Click, Apply, and silence.
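Just to convince myself the doors were actually shut, a quick socket check is enough. The NAS address and the port numbers below are placeholders, not my actual setup.

```python
#!/usr/bin/env python3
"""Quick check that the NAS ports I thought I closed are actually closed.

The address and ports are placeholders; substitute your own NAS IP and
whichever ports you changed or disabled.
"""
import socket

NAS = "192.168.1.50"  # placeholder address
PORTS = {22: "SSH (default)", 21: "FTP", 2222: "SSH (moved)"}

for port, label in PORTS.items():
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(2)
        is_open = s.connect_ex((NAS, port)) == 0  # 0 means the connection succeeded
        print(f"{label:>14} ({port}): {'OPEN' if is_open else 'closed'}")
```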

Not completely satisfied, I did some more research and found that the router I use has some clever UPnP services that automatically let connections through. Ah, okay, this made sense, but how was it being…

Oh, the NAS was asking for ports to be opened via UPnP. Click, Apply, sigh.
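If you’re curious which of your own devices are being chatty over UPnP, a rough SSDP sweep from any machine on the LAN will show who answers discovery. This is a generic sketch, not anything specific to my router or NAS, and it won’t show the port mappings themselves.

```python
#!/usr/bin/env python3
"""Rough SSDP sweep: ask the LAN who is advertising UPnP services."""
import socket

# Standard SSDP discovery request to the UPnP multicast address
MSEARCH = "\r\n".join([
    "M-SEARCH * HTTP/1.1",
    "HOST: 239.255.255.250:1900",
    'MAN: "ssdp:discover"',
    "MX: 2",
    "ST: ssdp:all",
    "", "",
]).encode()

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
sock.settimeout(3)
sock.sendto(MSEARCH, ("239.255.255.250", 1900))

try:
    while True:
        data, addr = sock.recvfrom(65507)
        # The sender address plus the first response line is enough to spot a NAS or router
        print(addr[0], data.decode(errors="replace").splitlines()[0])
except socket.timeout:
    pass
```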

It seems that my efforts to get OpenVPN working some months ago, combined with default settings on the NAS and router, meant I’d left my NAS vulnerable to external attacks. Now, I’ll admit that I hadn’t been super clever about disabling the default admin account or setting a really complex password, but it also seems like none of those scripts had successfully connected and found their way in. At least, I hope. Any logs the system had made, if any, were likely not saved. It’s a device I use somewhat often, and exfiltrating data or messing with the system would likely have raised flags with my internet provider.

Check your devices. Make sure they’re only asking for what they should be given. Look at some logs for an hour or 24. Change your passwords, even on “internal” networks, because they’re only safe if you didn’t accidentally poke holes in your network to make them more public than you wanted.

I’ve had a long history with the iPod. My first was a 5GB, Macintosh-formatted, first-generation iPod with a FireWire interface. The battery wasn’t very good, and those models had a bad habit of breaking the FireWire plug, so it’s good that Apple figured things out with the later versions.

My next iPod was a 40GB Video model. Open box from Best Buy, it was probably still over $450 way back then, but man was it nice. I used it on planes, for a radio show, and in cars. I sold it later to get another one, an 80GB I think, then eventually got the mother of all iPods, the 160GB, in 2007. iPod Touch was next on the list, and I never really looked back when touch screens, wireless, and an App Store were available.

We didn’t realize it at the time, but those of us fascinated with the iPod and audio quality became fans of the device because it used various versions of the Wolfson DAC. Other sites have ranked the best iPods for sound, and all of them predate the 6th-generation iPod, which seemed to mark a turn from Cupertino towards profit over quality.

Most iPods that weren’t flash-based had actual hard drives, 1.8″ units from several manufacturers. Larger iPods sometimes had deeper cases to accommodate the thicker drives. 160GB seemed like a lot, and it is, but hard drives have many deficiencies compared to flash storage in 2019. Along with drive capacity, the amount of onboard RAM also varied, so a 30GB and a 60GB iPod would have different total track and metadata capacities. Fun!

A friend mentioned offhand that he had a bunch of iPods. He wasn’t interested in them, so they were simply available if I wanted them. I took him up on the offer, and three of the four happened to be custom-etched 30GB iPod 5G units. They may be 5.5s, but the awkward part number doesn’t make it clear. Regardless, I was now in possession of some great hardware with weak batteries and failing hard drives.

iPod “Enhanced” 30G aka 5.5 PA447LL/A

To go with that aforementioned Wolfson DAC, I bought a used set of Etymotic ER-4B in-ear monitors for serious sound isolation. The idea was to get an awesome experience by blocking out as much noise as possible while getting powerful, high-quality, balanced sound. The iPod features a relatively powerful amplifier, and I was surprised at how well it drives the ER-4s.

Storage has been solved by one company in particular. They offer 1-, 2-, and 4-card SD upgrade options that replace the hard disk with a PCB housing one to four cards. mSATA and CF options are also available if you so desire. The PCBs are frequently in stock and ship from the UK. It can take a few weeks, and I’d recommend the shipping option with a tracking number too.

So, I ordered the iFlash Dual, a 128GB, 80MB/sec SD card, a replacement battery, and some tools. One after another, they showed up and piled up around the iPod. The adapter arrived last, fashionably so, which let me put everything together in a matter of minutes. See, I’d already done the hard part…

Getting the iPod open is tedious and needs care. I followed a guide on the iFlash website, another two on YouTube, and yet another on the iFixit site. The latter, especially, led me to the best sequence for working the back cover off that I’d seen. With a single spudger and an old credit card, I was able to get the rear cover off in minutes without breaking anything or bending it up. What a relief!

Hard disk removal was simple and easy. The battery was a little trickier with the adhesive, but a patient prying motion along the long axis of the battery got it out in no time. That done, reassembly was pretty much the opposite of removal.

Put the SD card(s) in the PCB. Check. Get the PCB lined up, with the SD card facing DOWN. Check. Slide the drive cable in the slot and lock it down. Check. Battery leads bent down at the end and re-attached. Check.

I did have a fumble with the drive connector, so I had to reseat it, but at the second time of asking, the iPod booted up. iTunes restored it, and I’ve since synced over 11,000 tracks to a nearly full iPod that now has 128GB of capacity.

Total cost was under $100. It helps that the iPod was no charge. Batteries are inexpensive, and SD storage gets cheaper per gigabyte every week. The most expensive item was the SD adapter, but it’s also the most important!

The sound is very clear, even at 80% volume. The menus are SUPER fast. I’m looking forward to better battery life, quieter operation, and more than four times the capacity.

Would I recommend this? Only if you have an iPod 4th or 5th generation available for cheap/free/in a drawer. I would hesitate to do this on the 6G or 7G unless you’re simply interested in carrying a HUGE amount of music with you and are less concerned about subjective quality.

iFlash Dual – $40 – iFlash.xyz directly

128GB SanDisk SD card – $22 – Amazon

Lenmar iPod battery – $11 – Amazon

One of the pitfalls of getting previously-used equipment is that there can be an unexpected roadblock to doing something trivial, like a BIOS update.

While testing the INTEL-SA-00086 Detection Tool, which is a simple and easy way to see whether or not your system’s Intel Management Engine firmware is vulnerable to the flaws covered by that advisory, I found a system that was running the A20 version of the BIOS. A22 is the current version.

Trying to run the update utility resulted in a request for me to enter the BIOS admin password. This is a piece of equipment that I got from another area, and while I could contact them, the Latitude E6530 is old enough that I figured someone had reverse-engineered the BIOS password reset algorithm. I was not wrong.

As soon as I saw the interface for the site that solved my problem, I recognized it after seeing it several years before to resolve the same issue.

Going into the BIOS, opening the Security tab, and entering an incorrect password gives you what’s called the service tag, in the format “1234567-595B”. This code, when entered at https://bios-pw.org, will give you a handful of different passwords to try. Thankfully, these older machines have no attempt countermeasures, so one can simply mash in candidates over and over.

The second or third entry was the key for my machine, and instead of letting me change the password or just unlocking it, the code removed the password entirely. Handy! A bummer, however, for people who think that an admin password is any more than a WEP-level block against accessing a system or changing its settings.

The neat thing about the site is that it seems to be based on research from a few people, and that its source code is available on GitHub: https://github.com/bacher09/pwgen-for-bios


I’ve been using WiGLE (<a href="https://wigle.net/">here</a>) for wireless network mapping since late 2017, when d4rkm4tter finally convinced me to try it. As an old-school wardriver, I was intrigued.

My old rig was a Compaq iPAQ 3600- or 3800-series with a PCMCIA sleeve, a Lucent Orinoco Gold card, and either a Yagi or omni antenna. Microsoft Windows Mobile and MiniStumbler handled the software side. I kept those logs until this year, when I uploaded them to the worldwide database.

WiGLE works by looking for wireless networks, Bluetooth devices, and cell towers, then trilaterating them with GPS. The mobile app only works on Android, and Google’s Android Pie breaks it due to severe limits on how often apps can scan for wireless networks.
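I haven’t seen WiGLE’s actual math, so take this as a toy illustration of the idea rather than their method: average the GPS fixes where a network was heard, weighted by signal strength, so stronger observations pull the estimate closer. The coordinates and RSSI values below are invented.

```python
#!/usr/bin/env python3
"""Toy illustration of locating an AP from drive-by observations."""

# (latitude, longitude, RSSI in dBm) - made-up observations of one network
observations = [
    (41.3100, -110.5000, -80),
    (41.3105, -110.5010, -65),
    (41.3108, -110.5012, -55),
]


def estimate_position(obs):
    # Convert dBm to a rough linear weight: stronger signal, bigger weight
    weights = [10 ** (rssi / 10) for _, _, rssi in obs]
    total = sum(weights)
    lat = sum(w * lat for (lat, _, _), w in zip(obs, weights)) / total
    lon = sum(w * lon for (_, lon, _), w in zip(obs, weights)) / total
    return lat, lon


print(estimate_position(observations))
```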

Astonishingly, I was able to find a nice Pixel for $100 from someone who’d purchased an S9 to replace it. Seemed legit, and all was well. It came with a Tech21 case and had a 3M/Scosche metal magnet mount on it, which was handy. The USB A-to-C cable and LG charger were worth $15 alone.

Now, the issues with Android Pie have been known since late 2018, and there’s currently no workaround to keep WiGLE able to do its job. The Pixel had been updated to Android 9, but hey, this is a Google device, so it’s super easy to unlock the bootloader and flash Oreo, right? Right? Well…

For some reasons, most of which are stupid but some of which make sense, Google has partnered with Verizon in the US market for Pixel sales online and in stores. This has undoubtedly helped the Pixel line sell more phones, but it enrages users like me who want the true Google experience, on a Google device, without the barriers. I remember the pain Sprint Nexus S and Verizon Galaxy Nexus owners went through, and I was hoping someone had figured out a workaround.

Some quick work to get Android Studio installed and following some guides got my phone connected, and I was able to shake off some cobwebs and get the proper components downloaded. fastboot was able to see the phone, but in my case, under Developer Options, the item called “OEM Unlock” was greyed out. My worst fears were coming true.

I rapidly searched for a workaround and came to an insightful, relatively clear, and eventually rewarding thread on that wonder of sites, XDA. The thread can be found <a href="https://forum.xda-developers.com/pixel-xl/how-to/how-to-unlock-bootloader-verizon-pixel-t3796030">here</a>, and I suggest reading the main post and comments before proceeding with an unlock.

Now, with this guide, I was able to unlock the Pixel. The most important thing to keep in mind is that “OEM Unlock” toggle, and patience at that point. When I checked it after running the adb command, it was still greyed out, but after a minute or two it lit up and I was able to successfully unlock.

After downloading the Android 8.1.0 image from Google and following some other instructions, I used the “flash-all” script to get Oreo installed and cross the finish line.

While the battery may be a little tired on this two-year-old phone, I still like it and will be using it alongside my Verizon G6, whose bootloader remains hard-locked and for which no workaround has been found.

WiGLE works great, and I’m finding some interesting results between the G6 and Pixel, especially given that they’re both using the same Qualcomm Snapdragon 821 SoC. Must be some different antenna designs and implementations between HTC and LG.

The Raspberry Pi is a wonderful platform, not only because it’s well supported, but because of its price and flexibility. There are lots of USB ports on the 2 and 3 models, plus HDMI and Ethernet, and now wireless and Bluetooth come onboard as well.

I had been meaning to experiment and build a wireless repeater using a Pi, and finally got a chance (and the courage) to attempt the feat. It wasn’t easy, and this is by no means a comprehensive guide, but rest assured that if you get it to work, you’ll understand why.

I started with a Pi 3, a 16GB Kingston microSD card, and an external Atheros AR9271-based adapter from TP-Link, made for a TCL television. I got this from an infamous friend, @d4rkm4tter of the #WifiCactus, and its “high-gain” antenna gives it extra reach for connecting to distant APs.

Raspbian Stretch was used, the full version because I’m not a full CLI master, starting with the April 2018 image and eventually moving to the October 2018 installer (which is nicer).

My intention was to use the external USB adapter to associate with a remote AP and get a connection. This would be bridged by the Pi and dnsmasq to the built-in wireless adapter on the Pi. My testing shows that the 802.11n Atheros adapter with the external antenna does indeed have improved gain.

I followed the guide found here: <a href="https://pimylifeup.com/raspberry-pi-wifi-extender/">PiMyLifeUp</a>

All of this was very helpful, and the guide is extremely well written and intended for Pi 2/3 owners. My only change to the guide, or difficulty with it, was that I wanted to invert the adapter roles, so changing “wlan0” to “wlan1” and vice-versa throughout was necessary.

The packages called dnsmasq and hostapd are the real workers here, and do the heavy lifting, routing, and interfacing with the adapters. I had some success just following this setup, but also found that it didn’t _just work_, so the tinkering and frustration curve began.

One neat trick I learned elsewhere is that hostapd can be run with the config file as an argument, which means you start the service with a verbose console feed, but it will tell you whether the config file is working and whether the AP has started. When you see “AP-ENABLED” and few, if any, errors, you’ll finally know you’re there.
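Here’s that trick as a small Python wrapper, mostly so I stop typing the command by hand. The config path is an assumption based on where the guide above puts it; adjust it to your setup and run with sudo.

```python
#!/usr/bin/env python3
"""Launch hostapd in the foreground and watch for AP-ENABLED.

The config path below is the one the guide uses; substitute your own.
Stop it with Ctrl-C once you've seen what you need.
"""
import subprocess

HOSTAPD_CONF = "/etc/hostapd/hostapd.conf"  # adjust to your own config path

proc = subprocess.Popen(
    ["hostapd", "-dd", HOSTAPD_CONF],       # -dd: extra-verbose debug output
    stdout=subprocess.PIPE,
    stderr=subprocess.STDOUT,
    text=True,
)

for line in proc.stdout:
    print(line, end="")
    if "AP-ENABLED" in line:
        print(">>> Access point is up; the config parses and the radio started.")
```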

As of this writing, I’ve been running the AP for several days. Throughput is weak, at less than 10Mbps, but the location this is meant for has very poor cellular coverage for non-Verizon customers. This application of the Pi fits a place or situation where _some_ connection is an improvement over _none_.

Interestingly, the PW-4210D adapter does have a removable antenna, so using a parabolic, omnidirectional, or Yagi antenna with an adapter is possible for a very long run. There are also more solid, cheaper wireless bridges available online, but if you’re a tinkerer like I am and have the spare hardware, there’s something special in feeling like you’ve made a bucket of parts do something interesting.

Go, do it.

Password managers like LastPass, Dashlane, Keepass, and 1Password (among others) are increasingly popular. Browsers, however, have been able to hold and store passwords for quite a long time.

Most of us use Firefox Sync or Chrome’s Google sign-in to keep things closer at hand than a password manager, with an eye towards simplicity rather than outright security. Sure, browsers have password requirements to use their vaults, but…

On a new PC, in Firefox, I was struggling to get the browser to ask to save passwords. This was weird, as I’ve always seen the prompt when I enter a new password for a site or update one. Not this time. In an effort to try again, perhaps thinking that a stored login was causing the issue, I clicked Remove All and told Firefox to go ahead and remove all of the passwords.

Bear in mind that I’d put in text to filter this list down to two, a login item with and without “www” in the URL. So, it was showing a list of two items. Remove all would remove these two, right? Nope.

I got annoyed when Firefox stopped responding, and eventually got the message that a script was taking longer to run than expected, etc. I didn’t think more of this until I tried again to get Firefox to remember the site, and after it didn’t suggest saving the password, I checked the Saved Logins again only to find a completely empty list.

Oops. Breathe.

New PC is less than a month old. Old PC is right there. Sync loves to be tidy, so make sure it’s not connected to a network because, sure as anything, it’s going to remove them from that PC too if it can.

Open Firefox on the intact PC, enter “about:support”, look for the Profile Folder entry, and click the Open Folder button. Find “logins.json”, “key3.db”, and “key4.db”.

I copied these to a USB stick, put that in the new PC, and immediately put a backup elsewhere in the cloud. Then I opened the same profile folder on the new PC and closed Firefox. After copying all three files into that folder, I held my breath, started Firefox, checked Options for Saved Logins, and saw a full list again.
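If you’d rather script the backup half of that, a few lines will do it. The profile path and destination below are placeholders; use the folder that about:support’s “Open Folder” button shows you, and keep Firefox closed while the files are copied.

```python
#!/usr/bin/env python3
"""Copy the saved-login files out of a Firefox profile to a backup folder.

Paths are placeholders; point them at your real profile and destination.
"""
import shutil
from pathlib import Path

# Placeholder paths - substitute your real profile folder and destination
profile = Path.home() / "AppData/Roaming/Mozilla/Firefox/Profiles/xxxxxxxx.default"
backup = Path("E:/firefox-login-backup")
backup.mkdir(parents=True, exist_ok=True)

for name in ("logins.json", "key3.db", "key4.db"):
    src = profile / name
    if src.exists():                      # key3.db only exists on older profiles
        shutil.copy2(src, backup / name)
        print(f"copied {name}")
    else:
        print(f"skipped {name} (not found)")
```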

Whew.

Now to get LastPass installed and export these to somewhere else more secure and cull the list of heavily-outdated passwords.

Oh, and yeah, figure out why Firefox isn’t asking to update or save new passwords…

I’ve had the unusual opportunity to get several used, late 2000s HP printers for use at work. Normally we get these new, fresh out of the box, and they’re maintained from that point. This has led to some interesting issues with firmware updates.

Anyone familiar with firmware updates on HP printers made in the last decade knows that there are several ways to do them. The first is to install the printer locally on a Windows computer and run the updater program against the installed device. This works, and works well, but only if the firmware on the device supports it. Second is the built-in updater in the printer’s web server, which is a feature of the higher-end Enterprise devices. Third is via FTP. Yeah, FTP.

After installing the printer as a local device, I have seen situations where the installer program will not work with a network printer object, whether it’s WSD or a TCP mapping. Usually this is resolved by plugging a computer directly into the printer, installing the local version, then running the program.

When this doesn’t work, however, FTP is still an option. It’s simple and easy, but also a bit scary, because this port is open by default, has no username or password, and allows a binary transfer directly to the device. Even after a firmware update to a version less than 18 months old, the port remains open.
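For the curious, the FTP route can be scripted too. This is a rough sketch of the idea rather than HP’s documented procedure: the printer address and firmware filename are placeholders, and the .rfu file is whatever HP’s support site provides for your model.

```python
#!/usr/bin/env python3
"""Push a firmware file to an HP printer over its open FTP port.

Address and filename are placeholders. The transfer is a plain binary STOR
to an anonymous login; the printer then restarts itself to apply it.
"""
from ftplib import FTP

PRINTER = "192.168.1.120"          # placeholder printer address
FIRMWARE = "printer_firmware.rfu"  # placeholder firmware file from HP

with FTP(PRINTER) as ftp:
    ftp.login()                    # anonymous, no password needed
    with open(FIRMWARE, "rb") as fw:
        ftp.storbinary(f"STOR {FIRMWARE}", fw)

print("Transfer complete; the printer should restart and flash itself.")
```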

This works, and it gets around the frustrating update process that most end users would follow, but hey, at least it’s not TFTP.

Today we shut off zZq’s Jabber server after 12 years. Unsurprisingly, GTalk/Hangouts started off offering federation with independent XMPP servers only to disable it in early 2017, after most people had migrated to Google. Sadly, this more or less rendered zZqIM useless. Only 3 users regularly logged in, which didn’t justify the cost of keeping the server running. The last holdouts have finally migrated to Hangouts.

With the money saved we will most certainly use it for alcohol and pour one out in honor of another lost friend.

 

zZqIM circa 2006

I have tried, and when the iPhone X is replaced by a newer, larger version, I’ll be back, but till then…

The iPhone 8 Plus is larger than I want it to be, but the screen is more important than FaceID, or the swiping gestures, or the one and only time I sent an Animoji to someone.

The (PRODUCT)RED iPhone 8 Plus is perhaps the best-looking iPhone of all time, and it will be mine. If I’m going to pay an extra $10 per month to be on T-Mobile’s JUMP! On Demand plan, I’m going to use the hell out of it and have zero regrets.

The “RED” phone on the 8 and 8 Plus has a black bezel, which, let’s be honest, is THE RIGHT COLOR. White bezels are for basic people who like Rose Gold and Silver and whatever other weak colors are out there that aren’t Space Grey.

So, the minute I can switch from the wonderful, but just-a-bit-too-small iPhone X, I’m going to. Likewise, the minute I can switch from the 8 Plus to the X2 Plus (or whatever it’s called), I will.