Category Archives: Personal Musings

What it says on the Tin

Musings…

I keep forgetting this blog is even here; more to the point, I'm impressed people actually read it. I know I have not kept up to date with comments and things on here, mostly because I didn't realise people read it!

So for 2019 I'm going to try and keep this up to date a bit more and interact with you guys. We have a lot going on in the next few months, including some big changes over here, so it'll be interesting to get the challenges up here in the open.

Although I don't and won't share personal contact details here, you can find our company Facebook page at https://www.facebook.com/realitytechportsmouth/

Au Revoir!

Here Be Badgers!

Right, here it is. The Unified Badger Theory…
 
You may have noticed that the only time you ever see badgers is dead by the side of the road. This has led us to believe that the natural state of a badger is therefore dead. We know there ARE live badgers, the Gubberment tells us so, as do farmers. A few people have seen them and also noted that their ground state of mind appears to be pissed off.
 
So what does this mean? It means that most badgers experience negative senescence. That is, a badger comes into being dead, becomes undead, and then regresses through to unbirth. This also happily explains why badgers are pissed off at everything. Being unborn sounds bad enough, but the prospect of the female having to un-birth really sounds no fun.
 
So far we have identified the following two subspecies.
 
The Common Road Badger – These blink into existence along the side of many roads around the country. It would seem these then become undead within a few hours and amble off, as they are rarely seen to remain in place.
 
The Lesser Smoking Kentish Rail Badger – As documented over at Notwork Fail. These will appear along the railway line and, possibly as a result of friction during creation, will often be found smoking. They have also been known to attack Notwork Fail and BTP officers on approach for no reason.
 
Rumours persist about the possibility of a third subspecies, known tentatively as the Hampshire Incandescent Road Badger. There have been limited sightings of this animal, which seems to become undead with a level of incandescent fury rarely seen outside a household containing two or more toddlers. Upon undeath these animals will wildly and indiscriminately destroy any motor vehicle present. It is suspected the consumption of one or more Land-Rover tyres may halt the attack.
 
It is important to note that, unlike sheep, badgers are not dragon resistant, although the fact they do appear to smoulder at the time of creation implies a level of fireproofing and high voltage insulation. We would recommend against the use of badgers on high voltage overhead transmission lines, as this could result in the evolution of a new species of 'Drop Badgers'.
 
Research is ongoing to ascertain the possibility of using badgers in time travel devices and whether they would make a suitable, low power alternative to flux capacitors if suitably contained.
 
We would discourage readers from attempting the installation instructions at http://strangehorizons.com/non-fiction/articles/installing-linux-on-a-dead-badger-users-notes/ in light of this discovery. Installation of the cyberspiritual controller at the moment of undeath may result in the badger changing mindstate from pissed off to mildly vexed or, even worse, bloody furious. Do not trifle in the affairs of Badgers!

Torro Conference and moving forward…

Well, some of you may know that I volunteered to stand up and talk about Touchdown, the Pod and SOA this afternoon. I never feel comfortable with this sort of thing; generally engineers are best kept as far away from the public as is possible, so putting one in front of a room of people really seems like a bad idea.

No one died, the message went over, I'm pretty sure it was well received, and no one has arrested me… yet.

So the interesting thing here is that there's been a development. SOA was always intended to have UK and US versions; the UK version wasn't a priority, and it turns out this may have been a mistake.

Two messages seemed to come through today. One, that data is being lost. Too much reliance is being put on people's memories, bits of paper and anecdotes. A large proportion of what Touchdown proposes centres around data retrieval, integrity and storage, and automating all of this as much as possible.

The other is that there is an appetite for something like SOA here in the UK. With the move from STM32 to the Raspberry Pi, the proposition that you could build your own and just use the Touchdown software is quite feasible. In fact we've had a large response to this idea.

SOA as it stands does provide a framework for doing a lot of this, and with the move to Pi a lot of the special hardware isn't needed. Making it modular is perfectly feasible, and with the Pi, if you can use jump wires then it's easy enough to get something running.

So where are we…

As the system stands a lot of the applications still need to be ported over. We've re-written a lot for Pi already, so there's nothing horrid there; it can be done and it's not a difficult job.
SOA was never designed to be 'hackable', so we need to make things as user friendly as possible. I'm thinking 'appliance' here, so it can be plugged in and mucked with if needed. Although the Pi can run a monitor, there's no need to actually do it if there's another computer to hand.

Sensor wise, it's down to the end user. There is a set family of devices we use, but adding support for others shouldn't be hard. Keeping it sane and sensible is important, so we'll likely pick just a few limited ones.

Reporting should be automated, with the user getting warnings and reports being sent back to a server. Maybe an expansion of the reporting system planned for SOA, so people can instantly log weather conditions that are out of the ordinary.
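As a rough illustration, here's a minimal sketch of that loop in Python: check a reading against a threshold, warn the user and post the observation back to a server. The endpoint URL and JSON fields are hypothetical, since SOA's reporting API doesn't exist yet.

import json, time, urllib.request

REPORT_URL = "https://example.org/soa/report"  # hypothetical placeholder endpoint

def report_event(station_id, kind, value):
    # Package the observation up and POST it to the reporting server.
    body = json.dumps({
        "station": station_id,
        "event": kind,
        "value": value,
        "time": time.time(),  # ideally a GPS/NTP-disciplined timestamp
    }).encode()
    req = urllib.request.Request(
        REPORT_URL, data=body, headers={"Content-Type": "application/json"}
    )
    urllib.request.urlopen(req, timeout=10)

reading_hpa = 962.0
if reading_hpa < 970:  # pressure out of the ordinary: warn locally, log centrally
    print("warning: unusually low pressure")
    report_event("station-01", "low_pressure", reading_hpa)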

If it's possible to maintain a stable, synchronised time source through GPS or NTP, then the possibility of lightning ranging and triangulation is opened up. It also allows storm trends to be analysed as they happen, a potentially powerful tool for tracking major storm systems.
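For the curious, here's a toy illustration of the ranging principle. Three stations with synchronised clocks record the arrival time of the same strike; the point whose implied emission times agree best is the source. The positions and timestamps below are made up, and a real system would use a proper least squares solver rather than this coarse grid search.

import math

C = 299_792.458  # propagation speed of the sferic, km/s

# (x, y) station positions in km, and the time each heard the strike, in s.
stations = [
    ((0.0, 0.0), 0.000200),
    ((50.0, 0.0), 0.000150),
    ((0.0, 50.0), 0.000300),
]

def spread(x, y):
    # Implied emission times; at the true source they would all agree.
    times = [t - math.hypot(x - sx, y - sy) / C for (sx, sy), t in stations]
    return max(times) - min(times)

# Coarse 1 km grid search over a 200 x 200 km area.
best = min(
    ((spread(x, y), x, y) for x in range(-100, 101) for y in range(-100, 101)),
    key=lambda r: r[0],
)
print(f"estimated strike position ~ ({best[1]} km, {best[2]} km)")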

On the back of it I think we will look into building a reporting system to use with this, along with a repository for reporting severe weather events and ways of automating certain reports and alerts. On top of this, a simple to use website to allow everyone access to the data.

Looking forward, the easiest way would be to offer either the pre-built units or instructions, or even both. As all Pis are essentially the same, a single SD card image can be used with all the necessary software, setup and modifications already done.

It's a very interesting proposition and one I think will be getting looked into over the next few months.

 

 

Security Through Obscurity…

Time and time again we are told the above doesn't work, and yet it still happens. Why, and why is it important?

In the process of our big tracker project I've been looking at packages that do the job already and also a number of control systems. Bearing in mind what these do, there is no security on most of it and token security on one of them. You may think that this isn't a big thing, and in most cases it isn't, but there are good reasons it is. So let's take a look at two culprits…

Vehicle Control Systems:
I've looked at three of these in the last few days and one thing is common to all three: no encryption, no ability to detect errors and no authentication. For the most part this isn't a huge problem, in fact it's a boon; I know how all three work and can talk to them. However, the bar is set low and any of them can be subverted easily. In the case of the first two I looked at you wouldn't gain a lot past the ability to turn things on and off, but the potential is there. The third is a fully integrated system designed to control all aspects of the vehicle, including battery management, and it's also designed to be linked to an onboard computer; now here there is scope for trouble. The system concerned could easily be subverted by malware, and then the fun begins. Further digging on this one turns up that no encryption is used on the comms link to the office either, but it gets worse…

Of the dispatch and tracking systems looked at, not one uses encryption. For the more basic ones this simply means you can pluck the tracking data out of the air, and at least one was easily fooled into connecting to a wifi network and forwarding the unencrypted traffic over my network: patient details, addresses etc, all in the clear. This means that this data can also be modified, so this particular system could be subverted pretty easily with a laptop, and either data creamed off and used for nefarious means (ask someone how expensive ambulance kit is!) or worse. Tracking data can be falsified and fed back to the base, meaning the vehicle doesn't show as being where it belongs, and uh-oh, the SOS button isn't going to help if someone jumps the 'bus'.

There is no excuse for this. None at all. The scary thing is that the same can be found in automotive systems from the factory and has recently been found in avionics systems too!

Serial channels can simply and effectively be secured. Even our most basic bus system, TRNet, uses authentication and device tracking mechanisms. If a device doesn't authenticate it is ignored, and even if it does it must be recognised on the bus or the system will lock the bus out. This is done by a few dozen lines of code, and although it could be subverted with a moderate amount of effort you'd need access to a lot of information to get started. This also gives us a mechanism for making sure the bus is reliable. Scarily, all but one of the three systems we looked at just kept sending the same data over and over again with no control path back to the master. Although it'll work, it means bit flips can happen in transmission and may be ignored (no CRC/checksum used), and there is a wealth of data for someone to grab and decode.
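To show how cheap this is, here's a minimal sketch of an authenticated, checksummed serial frame. TRNet's real format isn't published here, so the layout, the pre-shared key and the helper names are all assumptions for illustration; the point is that a CRC plus an authentication tag costs a few dozen lines.

import hmac, hashlib, struct, zlib

PSK = b"per-device pre-shared key"  # provisioned out of band (assumption)

def build_frame(device_id, seq, payload):
    header = struct.pack(">HIB", device_id, seq, len(payload))
    body = header + payload
    crc = struct.pack(">I", zlib.crc32(body))  # catches bit flips in transit
    tag = hmac.new(PSK, body + crc, hashlib.sha256).digest()[:8]  # proves the sender
    return body + crc + tag

def parse_frame(frame):
    body, crc, tag = frame[:-12], frame[-12:-8], frame[-8:]
    if zlib.crc32(body) != struct.unpack(">I", crc)[0]:
        raise ValueError("CRC failure: corrupted in transit")
    expect = hmac.new(PSK, body + crc, hashlib.sha256).digest()[:8]
    if not hmac.compare_digest(tag, expect):
        raise ValueError("bad auth tag: unrecognised device, lock the bus")
    device_id, seq, length = struct.unpack(">HIB", body[:7])
    return device_id, seq, body[7:7 + length]

print(parse_frame(build_frame(0x0101, 42, b"sensor reading")))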

Data channels running over a network should also be secured, and only trusted links should be used. Again this is basic common sense and doesn't take a lot of code. For this application we are using a pretty hardcore authentication system based on AES, but it's neither difficult to do nor expensive. With the system we have, both ends of the chain must be compromised, and in such a way that pre-encrypted data is grabbed; key shifts mean that just grabbing the encrypted stream and a session key won't help you, and even if you do grab a key you need the previous key to use it, so good luck.
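Here's a minimal sketch of the 'key shift' idea, with the derivation scheme assumed for illustration (the real system described above is AES based): each session key is derived one way from the previous one, so a captured key is useless without its predecessor.

import hashlib, hmac

def next_key(prev_key, session_nonce):
    # Ratchet forward: K(n+1) = HMAC(K(n), nonce). One way, so holding
    # K(n+1) doesn't give you K(n), and K(n) alone is stale next session.
    return hmac.new(prev_key, session_nonce, hashlib.sha256).digest()

k = hashlib.sha256(b"initial provisioning secret").digest()
for session in range(3):
    k = next_key(k, session.to_bytes(4, "big"))
    print(f"session {session}: key = {k.hex()[:16]}…")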

There is still a strong sense in the embedded and bespoke software/hardware market that no one is interested in specialist applications, that no one will try, and thus that you don't need to bother about security beyond making it convoluted. This is a fallacy. The value of the equipment on an ambulance is truly amazing to those not in the know, so gaining the ability to send such a vehicle to a false location of your choice, disabling its tracking and ability to call for help, is going to be of interest.

 

A Precautionary Word About Driver Updater Utilities

I see and remove a lot of these. They are almost universally hated by the IT industry and a VERY bad idea for the end user, but why?

First up, the malware side of things. A lot of these updaters are malware/foistware. The same trick is used as is pulled with a lot of 'health' checkers: find lots of problems, tell the mark you can fix them and then charge for doing so, even though the problems likely didn't exist or weren't problems at all. There are driver programs that do just this.

Ransomware. Well, not quite, but that's what it tries to do. It does the same as above and likely will actually find issues and be able to update drivers, but once again you'll have to fork out for something you could do yourself, and it may be harmful.

Next up, apps that do actually do something. Most of these still fall into the dangerous category, even those that are free and do seem to work. Why? Because the method they use is fundamentally broken. Almost all hardware provides two bits of information to the PC without being told or needing drivers: the PID and VID, or Product ID and Vendor ID. These are used by these programs to go and find out who made the hardware and what it is. It'll then grab a generic driver from the manufacturer or its library and off it goes. No harm done? Well, not really, no.
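For illustration, this is roughly all the information those utilities have to go on, read here from the IDs Linux exposes under sysfs (the paths are standard on Linux; the driver-matching logic a real updater builds on top is the broken part):

from pathlib import Path

for dev in Path("/sys/bus/usb/devices").glob("*"):
    vid_file, pid_file = dev / "idVendor", dev / "idProduct"
    if vid_file.exists() and pid_file.exists():
        vid = vid_file.read_text().strip()
        pid = pid_file.read_text().strip()
        # A generic updater matches (vid, pid) against a driver library,
        # which goes wrong whenever the vendor shipped the chip's default IDs.
        print(f"VID:PID = {vid}:{pid}")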

For a large number of devices this is just fine and dandy, but some developers are a bit lazy, and this is where it falls over. The idea was that each PID/VID combination would be unique: the IDs are provided by the chips used, and the company making the peripheral is supposed to assign its own values rather than use the defaults. In reality very few do.

Two examples. One: we have a laptop using an ATI X1300 GPU. In a laptop this GPU provides some secondary functions; in this case, it can take control of the backlight. The laptop manufacturer hasn't changed the PID or VID, and this wasn't something the ATI drivers were written for. Along comes the utility, matches the PID and VID to the default ATI driver, installs, and blammo. You have no backlight anymore; the new drivers don't know how to drive the backlight and so don't.

Next we have the Realtek ALC892, a fairly common sound chip. However, this thing has options out the wazoo. There are hundreds of combinations of speakers, inputs, outputs, mics etc, and these will all be changed by the vendor. Not all of them are there in the default driver; in particular there are a few general purpose I/O pins used to drive LEDs, power control for amps and more. So with the driver the hardware manufacturer supplies this all works; in comes the default driver and, once again, all these features are gone.

Both these scenarios could lead to physical damage to the machine as well as malfunctions in use, so it's really not worth it.

With time and research you can normally find the right drivers, and this is also why you should stick to manufacturers' drivers, especially on customised hardware such as laptops or SFF PCs. It's also why sometimes the right drivers don't really seem to work as planned, and the reason the Microsoft drivers aren't always that great and why you should always update core drivers on a new install; an example of this was the older VIA based motherboards that would run like a dog with the MS drivers. Only if you've tried everything should you think about these programs, and even then, here be dragons!

Test, test, test and TEST

After doing some work and applying the PPP changes from previous posts I was rewarded with nice, rock solid PPP connectivity. Excellent. I still need to take timeouts down and work out another way of monitoring link quality, but for all purposes it works.

However, something that did work, posting tracking reports, now doesn't. One report is posted with the link establishment and then it all breaks. A look at the logs shows that the tracker reconnection code has stopped working.

After a few tweaks and changes I really couldn't work out what was going on. Then it hit me.

The PPP code fires off two events, LanDown and LanUp. These are used to decide whether the GPS module should try and bring the link up again or sit tight and wait. In wholesale re-writing the PPP control code I forgot to fire either event. DOH!
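For illustration only, here's the shape of the bug in a Python sketch (the real PPP control code isn't shown here): the rewritten connect and disconnect paths have to fire LanUp and LanDown, or listeners like the GPS module never learn the link state.

class PppLink:
    def __init__(self):
        self.on_lan_up = []    # callbacks fired when the link comes up
        self.on_lan_down = []  # callbacks fired when the link drops

    def _fire(self, handlers):
        for handler in handlers:
            handler()

    def connect(self):
        # ... negotiate the link (LCP/IPCP etc.) ...
        self._fire(self.on_lan_up)    # the line the rewrite forgot

    def disconnect(self):
        # ... tear the link down ...
        self._fire(self.on_lan_down)  # and this one

link = PppLink()
link.on_lan_up.append(lambda: print("GPS module: link up, resume posting"))
link.on_lan_down.append(lambda: print("GPS module: link down, wait and retry"))
link.connect()
link.disconnect()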

Always test, test, test and then test again when changing things that sit in a chain!

The PC Market, Going forward in 2015

I was asked earlier where I thought things were going with the home PC market. With the announcement of free Windows 10 upgrades for many, it's obvious that now even the big players think things are about to change, so here are my thoughts. Jump in and belt up…


At Home:

The landscape of home computing has changed in the last couple of years and is still changing. Cheap, powerful portable devices are becoming ubiquitous and are replacing the humble PC and laptop, and this is changing everything. Desktop sales to domestic users have been slowing down quite substantially for a number of years; this isn't news. Most sales now tend to be to people that don't want the hardware lock in of a laptop, which really means gamers, tinkerers and those with specific needs for a tower. Most home users are now on laptops.

Smartphones and tablets are increasingly replacing these laptops in day to day use. Why fire up a laptop when you already have a booted phone? These devices are getting more and more powerful and integrated with other aspects of our lives, and are starting to push the laptops out now as well. Web browsing, email and online banking is pretty much it for a lot of the machines I support, and all but the most basic tablets can cope with this.

On top of this it's all getting cheaper. I was asked about repairing a £90 tablet the other day; after hunting, the parts were found at £55, and then, as the thing is glued together, I need to allow for a good hour of work. At this point it's not worth doing, so in the bin it goes. With laptop prices falling rapidly (I didn't say GOOD laptops) we are heading into disposable territory, and to be honest the design of the newer machines doesn't lend itself to repair anyhow, as we are returning to the days of proprietary non-upgradable hardware, which is somewhat ironic.

So where is this going? Well, for the dozens of small PC vendors around it's not good. The box shifters are doomed as they stand. Of the 4 PC vendors in my immediate area I don't expect any to survive based on the above. All 4 will find themselves fighting harder and harder for a smaller market share, and with exclusivity deals going on with the big players for tablet hardware and phones it's not looking great.

But there's still repairs, right?
For now. However, we've seen this happen before. Rewind to the 80s and 90s. A TV or VCR was a costly investment back then, and a very healthy economy existed of repair shops, authorised dealers and A/V shops (this is where I started out). As the trend for cost reduction kicked into gear at the end of the 90s and start of the 2000s, these items started to get cheap. All of a sudden it wasn't worth spending £100s on having your VCR cleaned of the jam sarnies your sprog had fed it, and the second generation tech was already on notice of obsolescence anyway. In the space of just a few years almost all of these shops vanished, those left hanging on for dear life fighting over the few loyal customers that remained.

Then, about 5 years later, prices crashed to the stage we are at now, where that lovely 42″ screen you are sat in front of that cost you £400 actually needs a £600 panel if you break it. It's not worth it, and most electronics goes in the bin. PCs are going this way now. Take something like the godawful HP Stream that's all over the TV. You can get this thing for £179, and for what's in there that's impressive. Now, you buy one for Billy and, as children do, he breaks it. So in you come and you are informed that the screen will be about £45/50 and the labour about the same again (that's actually a pretty cheap price), and hang on, that's the 50% rule broken and you may as well replace it. Your insurance co won't repair it, and god knows what else he's done to it.

So you see, we are well on our way already. Once this reaches its logical conclusion there won't be much point in doing the tablet repairs either; to make this all viable as a business you'd have to do so many at such a low cost you'd end up making pence at a time and would have to run so close to overload it's just not worth it.

From the domestic point of view, I think our days as small PC vendors to home users are numbered. Maybe not in 2015, but soon.

Business

This is a slightly better story. In short, things aren't going to change here that quickly. Businesses like continuity and often spend a lot of money on keeping it so. Business desktop sales are actually picking up, as are server sales, but sadly it's not all good news.

Businesses are looking increasingly to save more money, and IT always takes the hit; in this case it has set itself up for it. With increasing convergence of technologies that were once totally separate, such as telephony, paper filing etc, simply dropping a server in and going home isn't going to cut it anymore. Businesses are looking more and more at tying everything together, and it seems that among the SME IT vendors the skills to do so aren't commonplace, and contracts are being lost because of this. It used to be that if you didn't know how to do something you'd be able to find a way round it or, as was done by some businesses, simply say it's too expensive or can't be done. This didn't wash at the time and certainly won't now.

Many businesses are now wholeheartedly embracing technologies such as cloud computing, roaming profiles and network storage in a way they weren't a few years ago. The knock on from this is that the desktop machine is becoming more and more of a thin client. For some of you this will be familiar: a lot of people went down this route in the early 90s and before, in the shape of X terminals. Although they worked, the PCs overtook them pretty quickly, although this didn't stop Larry Ellison of Oracle saying we'd all be using thin clients in the future. He took a lot of flak for that and, it turns out, he was probably right.

With the massive increase in computing power and network performance since we first went this route it's almost becoming a given. A PC in an enterprise network will probably log into a domain, it'll do nothing more taxing than office work or web browsing in most cases, and it'll be working with all its files and user data stored on a server. This is nice from a support and backup point of view, and to be honest there aren't many good reasons for it to be done another way. The upshot is that all that's doing any work is the processor, and if we virtualise that machine you can have a slice of a much more powerful CPU on a server. And that's where we are headed. This is bad news for the PC, because of the current crop of thin clients only a handful could be called PCs; the rest are normally ARM or MIPS based, closer to the tablets and smart TVs than a PC. When a thin client goes, you throw it out. There's no software to troubleshoot, just the hardware.

[Image: a dinosaur watching the incoming meteorite]

So where does this leave us? Well, let's draw some conclusions:

The PC landscape is changing. This change is accelerating and will drive the demise of the PC as we know it now; this *is* happening now and would almost certainly form part of Microsoft's decision on Windows 10. PC desktops have the same fate as our friend above: he's dodged the bullet, but it's still going to get him. Laptops won't be far behind. It's worth noting, though, that the main opposition to PCs, in the form of tablets and the ARM ecosystem, offers nothing that can displace hardcore gaming, though it is coming. Ultimately Win10 will continue the march towards SaaS, or Software as a Service, and for that goal to be achieved Microsoft need to break the connection between Windows and the PC.

In business the change is still there, but it's slower. There is always going to be some requirement for that little bit more power than a thin client offers, and PCs will hold on there for a while. The amount of investment will draw out the death of the business PC a little longer, BUT bear in mind I could pick out all of my larger corporate customers and have them on thin clients without any appreciable loss of productivity in a few days; the change may be faster than in the domestic market.

It's interesting to note that there has been little technical drive from Intel of late; most changes have been incremental. The same could be said of AMD, almost as if Chipzilla and AMD are in a holding pattern. AMD seem to be hedging their bets: not only are they headed down the APU route, and the tight coupling of GPU and CPU lends itself quite well to a change of CPU core, but they are also pouring a lot of money and development into seriously fast ARM chips: http://www.amd.com/en-us/press-releases/Pages/64-bit-developer-kit-2014jul30.aspx There is also very little speculation, or indeed any information, on what Windows may become after 10, or indeed if there will be anything; the money maker here now is Office, and MS made that abundantly clear.

So, the box shifters that are still out there need to shape up and change focus, and be aware that the golden goose is about to buy the farm. It's time to lead and innovate and find better ways to engage and serve your customers. With lock in deals with larger vendors the playing field isn't very friendly, so now more than ever you need to make yourself stand out and move away from relying on PC sales and repair as a core business activity. Find a niche or, at the very least, plan an exit strategy.

I'm not saying that the sky is falling here; we all know it is going to, and soon. I'm saying start planning and be ready.

 

 

Linux VS Win Server TCO – The reality

While talking with a colleague yesterday the issue of their server came up. The IT guy has, and I can't fault the logic, gone with Linux and Samba to power what is essentially a Windows network. This would be an ideal situation except for a small number of flies in the ointment.

At the most basic level, drivers are the first hurdle. The Windows drivers include support for all manner of offload engines, acceleration technologies etc. The Linux drivers mostly lag way behind the Windows ones, but they are out there as a rule, and just getting the bare server up to snuff can take a while. So, starting from a bare server, we have a TCO at this point of £0 for Linux and £450 for Server 2012. Now, if we run on from here assuming you just install and go, you'll already be having problems. You'll be seeing performance hits on network and RAID hardware, easily fixed with drivers, however we are now talking a good hour or two fiddling and faffing with the manufacturer's Linux drivers (if they exist) versus just running a CD under Server 2012. Say your time is worth £50 an hour and the drivers aren't quite right, so a build from scratch is needed: two hours, and we stand at £100/£450. From experience 4/5 hours is closer to reality, as there will be missing things you don't find.

On top of that we now have to get Samba working on this server. Domain config with Samba isn't fun and the documentation is poor; we can easily have blown a day by now, so that's 8 hours or £400 vs £450 TCO on the software side. Now here's the nasty bit. That Linux server could EASILY take you up to a day to get working to a usable extent. The Windows server, from bare metal to running a domain: 2 hours tops. Being fair, we will allow for that two hours, so £400/£550. So, excellent, you saved £100, and probably shaved a few days off your life too. Only it's not done.

As a rule AD and Windows *just* works at the basic level, and you'll be looking at 5 mins per machine to connect the PCs. Samba? Prepare for permission and authentication hell, possibly on a per PC basis. In short, unless you are a Samba god you won't get it right first time, and file permissions are the usual cause. The Windows side could be finished in 4/5 hours total, from start to users logging in with roaming profiles, and you could probably have Exchange going too in that time. With Samba you may end the day at 'almost works, I'll fix the niggles tomorrow'. This is a 'bit of string' situation, but already there is a disparity here. The Windows server is done and working; I've gone home and am reasonably happy about a productive day. The Samba installer is probably not having such a good day.

Now let's look at the basics of what we might want to do with this server.

In a small office I'm probably going to want IIS for various tasks (it's fairly central to 2012, so not having it isn't an option TBH), so I have a web server and a scripting engine out of the box. I have a pretty usable and simple to use user administrator, and unless I install Server Core (and even then not necessarily) I can manage users, groups, email routing, web setup, replication and indeed about 90% of basic office tasks, and be done by the end of that first day.

Linux? Well, we've fixed Samba. I now have to install Apache, almost certainly some flavour of PHP, probably MySQL (we do on Windows anyway) and then start installing admin interfaces. Now, most distros do come with various bits and bobs already there; however, you'll find apps start asking for newer PHP versions, or extensions which need newer (or older) versions. It can get knotty fast.

It's not unfair to say that the Windows server would cost you about £800 and be done on day one. (I'm not factoring in the Exchange cost, the reason being that getting the same functionality on Linux is hell/impossible.)

Linux wise we are at two days, so again £800, and with a reduced level of functionality. Now, there's another big trick here many miss. That extra day? Well, there's your £400, but if this is a live or new setup you could also have to factor user downtime in there, so let's look at that.

Users cost less than consultants; how much yours cost is down to you, but let's assume the following. Your average user is on £25K. There are about 250 working days per year, so that's a nice easy £100 a day; for an 8 hour day that's £12.50 an hour. Think of a small office of 10 users (yes, I'm keeping the maths simple), so it's costing you £125 an hour to pay them. Now it's about to go wrong…

On Monday you've scheduled a server upgrade. You are doing it yourself, and let's say it's a complete refresh and start over. Your users have been told no PC use (it's a little contrived, but it DOES happen in small business). So for Windows we were down for a day; that's £1000 in lost time. You know where this is going, so we have £2000 on the other side. It's interesting to note that the extra £1000 would purchase a low end Windows server!
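Here are those back-of-an-envelope numbers as a small Python sketch you can rerun with your own figures; the salary, rates and hours are the assumptions from above, not measurements.

HOURLY_CONSULTANT = 50.0  # £/hour for your own time
USER_SALARY = 25_000.0    # £/year per average user
WORK_DAYS, DAY_HOURS, USERS = 250, 8, 10

user_hourly = USER_SALARY / WORK_DAYS / DAY_HOURS  # £12.50/hour
office_hourly = user_hourly * USERS                # £125/hour of idle users

def tco(licence, admin_hours, downtime_hours):
    # Licence cost, plus your time, plus what the idle office costs you.
    return licence + admin_hours * HOURLY_CONSULTANT + downtime_hours * office_hourly

print(f"Windows: £{tco(450, 8, 8):,.2f}")  # one day of setup and downtime
print(f"Linux:   £{tco(0, 16, 16):,.2f}")  # two days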

It's not hard to see the Linux admin over a day behind, and if that's so it's all gone wrong now, as the free server has a cost heading on for twice the paid one.

It's not all gravy though. Troubleshooting MS products can be an issue; debugging and logging with MS products can be lousy, and that can slow down finding some issues. There is a very twisty mindset required sometimes. That said, MS will help you if you call (within defined limits).

It's all up and running, and now we need to think about day to day use. Adding a new printer and sharing it can quite feasibly be done in a few minutes on a Windows server, and this includes making sure users get to it with no work from them, driver issues etc. The whole thing can certainly be comfortably wrapped up pretty fast. Linux offers no such mechanism, and indeed it's even possible that there are no native drivers for your printer; with a USB device this could mean a lot of work.

So to sum up: think long and hard about what you want to do. If you have plenty of time and it has no associated value (you need to look at the way you use your time!), want to use older hardware, maybe only want the absolute basics of a Windows network and have low blood pressure, then go for it. There are situations where it is exactly the right tool. If it *has* to work, then it may be significantly cheaper to go Windows.

Now, to stop the flames: we normally deploy both. In fact we often use Samba for slave/backup operations, and most of our sites do have at least one dedicated Linux system. Some things *do* work much better; often the server will be providing a web backend or database services to Windows (Access can use MySQL). Right tools for the right job. I also know I've simplified a LOT in here, but the numbers do stack up. If you sit down and work your project out on paper you'll see similar results; at best the Linux option may well cost you as much, and deployment methodology can make a big difference, but it's unlikely the TCOs will ever be better than equal, and the £0 TCO of Linux is a myth for business. Businesses rely on saving money as a rule; if Linux were a magic bullet, Windows would have no market.