Saturday 11 January 2014

From the get-go and what changed

Hi all,

It has been a long time since my last post. A lot has happened, and my interests have shifted quite a bit as well. First of all, I am no longer actively contributing to free software, as in the good and fun days tinkering with Amarok about a year and a half ago. The point is, desktop development in general does not make me tick anymore, especially when it involves front-end development. KDE is a great community, and the knowledge I acquired by contributing for two years certainly goes beyond ordinary desktop development: good software development practices, agile methodologies, build systems, tests, revision control, community building, software licensing and the like.

But then things changed. I got an internship at the Networking and Telecommunications Research Group (GPRT) to bring some dynamism to my undergraduate studies. My desire was to find something closer to the bare metal, and that is what I found: the opportunity to toy with a new and challenging area commonly called the Internet of Things (IoT), or, as IBM likes to put it, the Internet of Everything. Like every buzzword, the Internet of Things is drawing a lot of attention from tech guys and gals around the world, with new startups appearing here and there promising to solve the many obstacles that keep IoT's envisioned future from becoming a reality.

Coding for wireless sensor network (WSN) nodes built around 16 MHz, 16-bit Texas Instruments MSP430 MCUs with 8 KB of RAM and 92 KB of ROM was, among other tasks, part of my daily routine at GPRT. And believe me, embedded device programming can be very challenging. At the very beginning of my internship it was not uncommon to find out, after compiling, that your program did not fit in the node's scarce ROM. I spent several months revolving around WSNs, routing protocols, addressability, wireless interference, energy usage and the whole new set of problems that appears when you are dealing with such low-power wireless devices. Zigbee, IEEE 802.15.4, Contiki and TinyOS are some of the commonplace names when dealing with this kind of technology.
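To give an idea of that "does it fit?" routine, here is a small sketch of the kind of check one ends up doing on every build. This is not my actual workflow; it assumes the mspgcc toolchain is installed (msp430-size on the PATH) and a hypothetical firmware.elf image:

    # Check whether a compiled firmware image fits the node's memory budgets.
    # Assumes GNU size's Berkeley output format: text, data, bss columns.
    import subprocess

    ROM_BUDGET = 92 * 1024  # flash available on our nodes (bytes)
    RAM_BUDGET = 8 * 1024   # RAM available on our nodes (bytes)

    def check_image(elf_path):
        # Second line of the `size` output holds the numbers for our file.
        out = subprocess.check_output(["msp430-size", elf_path]).decode()
        text, data, bss = (int(f) for f in out.splitlines()[1].split()[:3])

        rom_used = text + data   # code + initialised data live in flash
        ram_used = data + bss    # initialised + zeroed data live in RAM

        print("ROM: %d / %d bytes" % (rom_used, ROM_BUDGET))
        print("RAM: %d / %d bytes" % (ram_used, RAM_BUDGET))
        return rom_used <= ROM_BUDGET and ram_used <= RAM_BUDGET

    if __name__ == "__main__":
        if not check_image("firmware.elf"):  # hypothetical image name
            raise SystemExit("Image does not fit on the node")

When the answer is "no", the fun begins: trimming buffers, dropping features and fighting the linker until the numbers fit.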

Afterwards, my focus changed to developing middlewares to coordinate those resource-limited wireless devices. Leveraging ZeroMQ, that amazing free software socket library that doubles as a concurrency framework, it was pretty easy to develop a distributed middleware that could scale horizontally. Because, as everybody knows, scaling vertically is old school. It is much better to just buy commodity hardware, like Google and every sane company dealing with heavy processing requirements does, instead of spending tons of dollars putting a few expensive servers on steroids.
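To give a flavour of what I mean, here is a minimal sketch, not our middleware's actual code, of how a ZeroMQ PUSH/PULL pipeline lets a gateway fan sensor readings out to as many worker processes as you like; scaling out means starting more workers on more boxes. It assumes the pyzmq binding, and the port and sample readings are made up:

    # Minimal ZeroMQ PUSH/PULL sketch: a gateway fans readings out to any
    # number of workers; ZeroMQ load-balances across connected peers.
    # Assumes the pyzmq binding (pip install pyzmq); the port is arbitrary.
    import sys
    import zmq

    def gateway():
        ctx = zmq.Context()
        push = ctx.socket(zmq.PUSH)
        push.bind("tcp://*:5557")            # workers connect here
        for reading in [{"node": 1, "temp": 27.5}, {"node": 2, "temp": 26.9}]:
            push.send_json(reading)          # distributed across workers

    def worker():
        ctx = zmq.Context()
        pull = ctx.socket(zmq.PULL)
        pull.connect("tcp://localhost:5557")
        while True:
            reading = pull.recv_json()       # blocks until a reading arrives
            print("node %(node)s reported %(temp)s C" % reading)

    if __name__ == "__main__":
        worker() if "worker" in sys.argv else gateway()

Run one gateway and as many workers as the load demands; adding capacity is a matter of starting another process on another cheap machine.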

Scalable systems are a must for IoT, as every WSN node has to somehow report its sensors' measurements, usually in a timely manner. There are alternatives, but I will not go into their merits. To give a feel for the envisioned scale of WSNs, the academic literature mentions figures of three thousand or more nodes in a single network, and standards like IPv6 over Low-Power Wireless Personal Area Networks (6LoWPAN) and the Routing Protocol for Low-Power and Lossy Networks (RPL) were developed to address these scenarios. Obviously, the field is recent enough for me to doubt that anyone uses WSNs that big in production. Curiously though, one of our industry partners has a network of similar proportions, but it still uses wires. Have you ever wondered how hard it is to manage all those wires as the shop floor is rearranged? Their case, however, is very specific, since shop floors tend to suffer from severe interference in the wireless spectrum, mainly due to welding and the presence of metal everywhere. Harsh environments aside, IoT adoption is mostly a matter of how cheap those nodes become and how we overcome the existing technical difficulties of the wireless world. Because, at the end of the day, industry traction around new technologies is just a consequence of their economic effectiveness.

And believe me, tons of flowing data can be really precious when decision making is at stake. Curiously, decision making is also the biggest paradox limiting IoT adoption by ordinary people. While data about a phenomenon can help a professional make a decision, as is common in disciplines like engineering, economics and administration, that is not what most people expect from a home automation system, for instance. Technology has already given us too many gadgets to tame, and I suspect that no one wants more.

Companies like SmartThings and NinjaBlocks, which are in the home automation niche market, face this kind of problem. They require some input from users in an "if this then that" format in order to adjust the system to the end user's needs. Therefore, end users have to know in advance what it would be interesting for the system to do when something happens. Should the lights be turned off if nobody is inside the house? Should this? Should that, even if x happens? Managing all of that can be pretty tedious, and it is, in my humble opinion, the biggest challenge that these IoT-era home automation systems have to cope with in order to really succeed in the long run.
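Just to make the "if this then that" idea concrete, here is a toy sketch of what such rules boil down to. The state fields, predicates and actions are made up for illustration; they are not any vendor's API:

    # Toy "if this then that" rule list: each rule pairs a predicate over
    # the home's state with an action to take when it holds. Field names
    # and actions are hypothetical.
    rules = [
        (lambda state: state["occupants"] == 0 and state["lights_on"],
         "turn the lights off"),
        (lambda state: state["occupants"] == 0 and not state["door_locked"],
         "lock the front door"),
    ]

    def evaluate(state):
        # The burden is on the user: every behaviour they want has to be
        # spelled out as one of these rules in advance.
        return [action for predicate, action in rules if predicate(state)]

    print(evaluate({"occupants": 0, "lights_on": True, "door_locked": False}))
    # ['turn the lights off', 'lock the front door']

Every new sensor or appliance multiplies the combinations the user is expected to foresee, which is exactly the tedium I am talking about.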

Besides all that, I have also been exploring the realms of Big Data, from NoSQL graph databases (Titan for the win o/) to map-reduce architectures for our new research project at the internship.
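For readers unfamiliar with the pattern, here is a toy illustration of the map-reduce idea in pure Python, counting readings per sensor; it is not our project's code, and the data is made up. Real frameworks distribute the same phases across many machines:

    # Toy map-reduce: the map phase emits (key, 1) pairs, the shuffle groups
    # them by key, and the reduce phase sums each group.
    from collections import defaultdict

    readings = [("s1", 27.5), ("s2", 26.9), ("s1", 27.8), ("s3", 25.0)]

    def map_phase(record):
        sensor, _value = record
        return (sensor, 1)

    def reduce_phase(key, counts):
        return (key, sum(counts))

    # Shuffle: group the mapped pairs by key.
    groups = defaultdict(list)
    for key, value in map(map_phase, readings):
        groups[key].append(value)

    result = dict(reduce_phase(k, v) for k, v in groups.items())
    print(result)  # {'s1': 2, 's2': 1, 's3': 1}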

Finally, my next post will cover details of my personal project to analyse data from Brazil's 2012 municipal elections with Python, an initiative that aims to give me the means to master the art of data analysis. I hope it can be informative both to people interested in the data analysis process and to those interested in Brazilian politics.
