We hear a lot about 'The Internet of Things', mostly in the context of the 'Smart Home', for which you can already buy expensive gizmos that let you turn on your lights or your heating from your phone rather than walking across the room to flip a switch. In some ways this is a shame, as doing the job manually provides what is probably much-needed exercise, and may also help avoid just a little of the repetitive strain injury that comes from constantly thumbing a phone screen. My daughter has a robot vacuum cleaner that she can set cleaning the house while she's on holiday in an entirely different part of the world, again by using her phone, but we're told that the real revolution will happen when household appliances communicate and use Artificial Intelligence to try to anticipate our needs. The oft-mentioned example is the fridge or freezer that orders food in for us when supplies run short; once the devices start to communicate effectively, the fridge will be able to take a look in the freezer to see what we already have before it sets about doing our shopping for us.
I guess the next step is to install one of those intelligent toilets that have been available in Japan for some time now — the ones that analyse the 'proceeds' before displaying dietary advice. Link one of those to your fridge or freezer and it will be ordering sprouts and broccoli in place of burgers before you know it. And don't even think about letting the bathroom scales talk to the fridge!
All well and good, I suppose, but what does this mean for our home studios? Is it only a matter of time before some embedded AI decides that our singing or playing isn't up to scratch and orders us a course of lessons? Or how about a difficult mix session where your DAW starts buying new plug-ins on your behalf? Perhaps a guitar that can tell when its strings are worn out or rusty and order new ones for you?
There are already examples of software based on machine learning trying to write hit songs, so it is only a matter of time before we start asking an AI to listen to our compositions and tell us whether what we have is any good or not. Maybe they can also listen to the stuff that other AIs have written and mark it out of 10? That way we'd have more free time to go to the supermarket and buy the food we actually want, rather than what the fridge thinks is good for us. And as for phones, whenever the salesman asks me what features I want on my new phone, I say I just want to be able to enter a unique number that allows me to speak to somebody who is in an entirely different location. A novel concept, I know, but who ever went to the doctor with RSI of the tongue?