My History of Home Computers

My first computer was a TRS-80 Model I: 16 KB RAM, a cassette tape drive for data storage, and a 16-row, 40-column uppercase-only black & white monitor, all built into the keyboard unit. I still remember the taxi ride home, because the whole box was just too large to carry onboard a city bus. I quickly upgraded it with a third-party expansion box that took it to 48 KB RAM and added ports for an Epson MX-80 dot matrix printer, an Exatron Stringy Floppy tape storage device, and a Hayes 300 modem. The TRS-80 was a nice first computer, and it still worked when I threw it away many years later. Someone grabbed it from the street before the garbage truck came, so maybe it's still in use.

My next computer was a Franklin Apple IIe clone with 48 KB RAM, which featured two built-in 5.25″ double-sided floppy drives, and a Z-80 co-processor board with its own 48 KB RAM and 16 KB ROM, so I could boot CP/M and run WordStar and other CP/M software. Graphics-wise, the Apple platform was much more powerful than the TRS-80. I upgraded it with a 10 MB RAM disk, a lightning-fast storage device that would lose everything if the power went out, so it could only be used to cache the OS image and temporary data.

The Apple II user community was tight-knit and helpful, with lots of creative types releasing tons of magazines, BBSs, software, and printer templates to let you use your computer to its fullest in the 1970s and 80s.

My next computer was an i386 with 8 MB RAM and a 200 MB MFM disk, running MS-DOS. Eventually I got DESQview, multitasking software that let me run multiple DOS boxes simultaneously and switch between them at will. Windows 3.1 became a thing, but it was extremely primitive compared to what Windows has become today. A few years later, I upgraded that case with a new motherboard, and an AMD 486-clone CPU later on, and had a 2400 bps modem for a long time. Later, work gave me a 9.6 kbps modem, and later still a 14.4 kbps modem, and that's when I started accessing the Internet over dialup. I bought a multiport serial card to hook up multiple modems for a BBS, and a SCSI-2 interface card to add a couple of hand-me-down 9 GB hard drives.

It was on this 386 that I started running ESIX, a release of AT&T System V Release 3 UNIX for i386 platforms. I even wrote my own BBS in Perl, and had a few regular users. I also tried out the new GNU/Linux OS by installing Slackware (kernel 0.99pl11). At work I was learning about BSD Unix, and at home I had DOS, Windows 3, SysV UNIX, and first-generation Linux.

Eventually I upgraded my desktop to a prebuilt HP Pentium (i586), where I ran Windows 95 and various flavors of Linux. I went from Slackware to Red Hat, Mandrake, TurboLinux, Ubuntu, and finally CentOS. My current favorites are Xubuntu on desktops and CentOS 7 on servers.

I didn’t latch back onto the Apple train until my boss gave me a Mac Pro workstation to take home and learn why they all liked it so much more than Windows and Linux. It didn’t take me long to come to the same conclusion: when you use a Mac, you spend more time using the computer, and far less time mucking about with settings, drivers, upgrades, and other hassles. You still need anti-virus software, and reliable two-level backups – local and remote/cloud.

These days, I’m getting tired of Apple’s dropping reliability and constantly rising device and repair costs. After the Spectre and Meltdown disasters, I want to wait for the next generation of CPUs to come out, then replace my all-in-one iMac with something in the class of an AMD Ryzen 7 1700X with a GTX 1060, and just go back to running Ubuntu at home.

I’m perfectly fine running apps as containers using LXD and Docker tools, and using VirtualBox to run any other OSes I want. I will lose access to some music I’ve purchased through iTunes, but I’ll still have my iPhone, until such time as a generic, open source, touchscreen smartphone market becomes a thing.

To be a success, an open source phone would require a published, documented, verifiable design that anyone could implement, improve upon, and release. That would be hard, because it sounds so anti-business and anti-profit, but think about how the original PC design became open, how everybody made money selling their own version, and how much users benefited from cheap, open, standard interface ports.

Of course, the phone would also need an open source, community-managed operating system, all the basic apps, and an editable list of trusted app stores to download apps and updates from.

Android emulators for Linux already exist, and Android is itself an open source, Linux-based system, so perhaps Android app store compatibility could be worked out, such that some of the most popular user apps could run emulated at launch, until the developers can be convinced to port their apps to our platform.

Simple Python HTTP server

I haven’t worked on any technical projects at home lately, but I found some time today to play with something I’ve been curious about. I wrote a couple of simple Python scripts (below) that each implement a REST-style API service.

The first is a 145-line Python script, found here, that creates a simple threaded HTTP server implementing a custom REST-style API for a key/value store of JSON-encoded data.

The API consists of four combinations of URI and method (GET, POST, or DELETE). Every URI must start with the fixed prefix /api/v1/.

Method  URI                     Function
GET     /api/v1/fetch/<key>     fetch a record
POST    /api/v1/insert/<key>    insert a new record
POST    /api/v1/update/<key>    update an existing record
DELETE  /api/v1/delete/<key>    delete a record
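I won’t paste the whole 145-line script here, but a minimal sketch of the same idea fits in about 50 lines using only the standard library. The class names, port number, and HTTP status codes below are my own assumptions; only the URI layout and the message strings come from the API described above.

```python
import json
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

PREFIX = "/api/v1/"
store = {}  # in-memory key/value store (a real version would lock around access)

class KVHandler(BaseHTTPRequestHandler):
    def _reply(self, code, text):
        body = (text + "\n").encode()
        self.send_response(code)
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def _parse(self):
        # split "/api/v1/<action>/<key>" into (action, key)
        if not self.path.startswith(PREFIX):
            return None, None
        parts = self.path[len(PREFIX):].split("/", 1)
        if len(parts) != 2 or not parts[1]:
            return None, None
        return parts[0], parts[1]

    def do_GET(self):
        action, key = self._parse()
        if action != "fetch":
            return self._reply(400, "ERROR: Unsupported Service")
        if key not in store:
            return self._reply(404, f"ERROR: recordid {key} does not exist")
        self._reply(200, json.dumps(store[key]))

    def do_POST(self):
        action, key = self._parse()
        if action not in ("insert", "update"):
            return self._reply(400, "ERROR: Unsupported Service")
        raw = self.rfile.read(int(self.headers.get("Content-Length", 0)))
        try:
            data = json.loads(raw)
        except ValueError:
            return self._reply(400, f"ERROR: bad json data - {raw.decode(errors='replace')}")
        if action == "insert" and key in store:
            return self._reply(409, f"ERROR: record {key} already exists")
        if action == "update" and key not in store:
            return self._reply(404, f"ERROR: recordid {key} does not exist")
        store[key] = data
        verb = "inserted" if action == "insert" else "updated"
        self._reply(200, f"INFO: record {key} {verb}")

    def do_DELETE(self):
        action, key = self._parse()
        if action != "delete":
            return self._reply(400, "ERROR: Unsupported Service")
        if key not in store:
            return self._reply(404, f"ERROR: recordid {key} does not exist")
        del store[key]
        self._reply(200, f"INFO: key {key} deleted")

    def log_message(self, *args):  # keep the console quiet
        pass

def serve(port=8080):
    # blocks forever; ThreadingHTTPServer handles each request in its own thread
    ThreadingHTTPServer(("", port), KVHandler).serve_forever()
```

ThreadingHTTPServer does the threading for free (Python 3.7+); the only real work is routing on the URI and validating the JSON body before storing it.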

$ curl
ERROR: Unsupported Service
$ curl
ERROR: recordid foo does not exist
$ curl -d '{"key1":"val1", "key2":2}' -H 'Content-Type: application/json' -X POST
INFO: record foo inserted
$ curl
{"key2":2, "key1":"val1"}
$ curl -d '{"key1":"val2, "key2":3}' -H 'Content-Type: application/json' -X POST
ERROR: bad json data - {"key1":"val2, "key2":3}
$ curl -d '{"key1":"val2", "key2":3}' -H 'Content-Type: application/json' -X POST
INFO: record bar inserted
$ curl
{"key2":3, "key1":"val2"}
$ curl -X DELETE
INFO: key foo deleted
$ curl
ERROR: recordid foo does not exist

The second is a much shorter and simpler server that provides an unauthenticated HTML file upload service to a private directory.

$ ls -l testfile.html
-rw-r--r-- 1 user group 93 Mar 17 15:47 testfile.html
$ curl --data-binary @testfile.html -H 'Content-Type: text/html'
INFO: 93 bytes uploaded to testfile.html
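The core of that upload server is just a POST handler that reads the request body and writes it to a file named after the URI path. This is a sketch of the idea, not the actual script: the `uploads` directory name, handler class, and port are my assumptions; the response message matches the transcript above.

```python
import os
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

UPLOAD_DIR = "uploads"  # hypothetical private directory for uploaded files

class UploadHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # basename() strips any directory components, so "../etc/passwd"
        # can't escape the upload directory
        name = os.path.basename(self.path)
        if not name:
            self.send_error(400, "missing filename")
            return
        data = self.rfile.read(int(self.headers.get("Content-Length", 0)))
        os.makedirs(UPLOAD_DIR, exist_ok=True)
        with open(os.path.join(UPLOAD_DIR, name), "wb") as f:
            f.write(data)
        body = f"INFO: {len(data)} bytes uploaded to {name}\n".encode()
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the console quiet
        pass

def serve(port=8080):
    # blocks forever; curl --data-binary @file http://host:port/file uploads a file
    ThreadingHTTPServer(("", port), UploadHandler).serve_forever()
```

Because it is unauthenticated, something like this should only ever listen on a trusted network.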

Winter is ending

It's the end of February, and it was over 70 degrees F the past two days here in Georgia, USA, which is perfect bike-riding weather. Pollen has been spotted, so Spring is undoubtedly just around the corner. But Winter could still make a last-minute stand; we'll see.

Cheaper alternatives

My internet hosting service provider (not my ISP, the people who host this website and my email) sent me a bill of over $520 for two years of service. I cancelled my auto-renewal because, honestly, I can't afford that much right now. Over $20/mo for email and a low-traffic blog is just not feasible anymore.

My options include transferring my domain to a new hosting provider (even Hostmonster offers new customers a $5/mo choice), or setting up and managing my own servers. For the latter, I inquired about getting a static IP via my ISP, but found that a /29 block of 8 static IP addresses would add $15/month to my service. Not much cheaper, and I'd have to manage and back up all my own servers. I have decided to move my domain to Amazon Web Services, for several reasons.

Partly because AWS charges only for usage, I can host my services anywhere in the world, and my email and web users are VERY low traffic, so it'd probably average less than $12/mo; but mostly because my employer is moving toward AWS cloud services, and I need that kind of experience to make smarter decisions and build useful tools at work.

I was approached a few years back by representatives of the Whistl delivery company in the UK, who were interested in obtaining the domain. My knee-jerk reaction was to tell them "no way", but if they were to approach me again now, I think I'd probably let them have it for a reasonable price. I don't really need it, especially since I get so much spam as it is, and I also own the domain, which I hardly use, because *.us domains are mostly spambots.

Browser Choice

Desktop browser choice is pretty much a user-defining thing. The average user just uses whatever browser came pre-installed. Power users, however, pick their browser and install it if it's not the default. Most pick Google Chrome, others Firefox, yet others Opera. Those paranoid about privacy choose the Tor browser (based on Firefox).

The Opera browser was purchased a while ago by a Chinese marketing company, so I can't help but think they have an interest in collecting as much user behavior data as possible. Chrome is made by Google, an American marketing company; same thing. Chrome also has the security weakness that it does not check whether SSL certificates have been revoked. Firefox has other weaknesses, but far fewer.

I left off Safari and Internet Explorer because they are not cross-platform browsers, and they are usually not worth considering except to download a better browser. I've only heard about Microsoft's newest browser, Edge, which I think is specific to Windows 10. It may be the fastest browser on that platform, and it is the only browser Microsoft will allow on the low-end Windows 10 S tablets and systems. Microsoft has turned itself into a huge information-mining operation; they need only market and sell that anonymized data for untold profit.

Well, right now, on my Mac, I'm using Google Chrome to watch video streams on my second screen, because it works well, and Firefox doesn't. There was something chipmunk-like about the audio under Firefox. But I am using Firefox to browse, because it's not made by a marketing company, and Firefox with Privacy Badger is a pretty safe browsing environment.

I’d love to hear your thoughts. Just tweet @whistl034

HDHomeRun and Plex make an inexpensive DVR

Last year, when we switched from Comcast to AT&T U-verse, we already had a four-tuner TiVo OTA and an antenna in the bedroom, so we were fine there. For our office TV, I installed a TiVo Mini I already had, and we were golden for TV and DVR.

Last week, the TiVo Mini bit the dust mid-program. It won't boot, just a sad yellow "sorry boss" LED. I think it was $130 and it lasted four years, so I got my money's worth.

Today, I set up an Ubuntu server running the Plex Media Server software. I already had a free Plex account, so I could run Plex Media Server to stream my own media library, and I can also run the free Plex player client on my Mac. Their new DVR function, however, requires a Plex Pass, which costs money: one year for $40 or a lifetime pass for $120. Once purchased, the Plex player client reveals additional functions, like setting up the DVR.

Plex gives away its two programs. The Plex Media Server is where you store your DVR recordings, so think of it as the DVR itself; you can have multiple DVRs, each controlling a different tuner, if you like. The second program is the Plex Media Player. They have clients for just about every device and OS (it's hard to find one they don't cover), and you can always just use the web interface. If you open up the port to the Internet, you can watch your video away from home, too.

Oh, but where to get the TV signal and channel guide? Last year, I bought a two-tuner HDHomeRun box from Amazon for about $100, and a flat-panel HDTV antenna for my office window for about $45. I decided I wasn't a fan of the HDHomeRun Mac software and DVR, so the HDHomeRun sat unused until now. Plex automatically locates the HDHomeRun box and lets you set up the TV channel guide too. It presents the guide in a unique format that I'm not used to.

So, I'm giving it a shot. I have some recordings set up, and it's working fine. Video playback is smooth, and not a whole lot of disk is being used. Win!

AT&T Gets Light FCC Wrist Slap For Largest 911 Outage Ever

So, AT&T makes an "error" while updating 911 software that blocks 12,600 people from reaching 911 services, across 14 states, over an untold number of hours. But they get no fine. "Unacceptable," says FCC chairman Ajit Pai. "That's really just too bad, somebody might have died, but it was nobody's fault, so we'll just ignore it," the chairman seemed to imply.

News: Last March, AT&T suffered a massive 911 outage that prevented customers across fourteen states from being able to call 911. While it was buried under tod

Source: AT&T Gets Light FCC Wrist Slap For Largest 911 Outage Ever