Architecting an IoT Ingestion Platform with AWS
One developer's journey...
I’ve spent the last several years working with data and analytics on some interesting datasets with widely varying requirements, but none of them demanded the near real-time responses that an IoT platform’s architecture has to be built around. That’s just one of the many reasons I joined THINaër! Data that is collected multiple times per second, shipped somewhere, and expected in a user’s hands not long after it was first captured reminded me of how much fun I had in a similar role, processing movie ticket and concession sales as my group transformed a mid-sized theatre company. Combine that with the management team and the simple truth that I love building things, and THINaër is the perfect place for me.
THINaër is an IoT company that works to be The Internet of Tagged Things℠, which means we’re hardware agnostic and not bound by quarterly device sales or quotas like so many other companies in the space. What we’re really focused on is data and its many uses in solving interesting problems. Our customers expect up-to-the-minute, accurate tracking of assets, along with any peripheral information associated with the device or beacon attached to each asset. I don’t distinguish between people, places, and things because, in truth, an asset can and often will be all of those. We do provide that level of abstraction, but that’s perhaps a topic for another post.
The approach I’m going to describe isn’t the only approach, and I’m not endorsing one cloud provider over another. But for the sake of detail, we are currently using AWS and a host of its services. Coming from an almost exclusively Microsoft Azure background, I have been plenty happy learning the new platform.
As a C# guy who has spent nearly the last decade in Microsoft tech, ramping up on new technology was a shift, but being a small team with some in-house expertise lessened the learning curve. And if you know me personally, you know I’m up for a challenge and fully immersed in the learning.
There is some IP in here that I won’t discuss in detail, but we essentially choose candidate messages to be formatted and sent down the line to make their way into final storage. Once those records are selected, they are immediately posted into AWS’s Kinesis Data Streams. Kinesis Data Streams is exactly what it sounds like: a stream of data that is broken up and managed via shards. Data can be read almost instantly from the moment it is written, and it’s designed to handle batches of records per second. Large batches of records.
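To make the posting step concrete, here's a minimal sketch of how candidate messages might be batched for Kinesis from Node.js. The stream name, message shape, and `beaconId` partition key are my illustrative assumptions, not our actual code; the 500-record ceiling, however, is the real per-call limit of the PutRecords API.

```javascript
// Sketch: chunk candidate messages into PutRecords-sized batches.
// PutRecords accepts at most 500 records per call, so larger sets
// of selected messages are split before sending.
const MAX_BATCH = 500;

function toBatches(messages) {
  const batches = [];
  for (let i = 0; i < messages.length; i += MAX_BATCH) {
    batches.push(
      messages.slice(i, i + MAX_BATCH).map((msg) => ({
        Data: JSON.stringify(msg),
        PartitionKey: msg.beaconId, // spreads records across shards
      }))
    );
  }
  return batches;
}

// With the AWS SDK for Node.js, each batch would then be sent as:
//   kinesis.putRecords({ StreamName: "ingest", Records: batch }).promise();
```

The partition key matters: records with the same key land on the same shard, so keying by beacon keeps per-device ordering while distributing load.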
The downside to Kinesis? You’re limited to 5 read transactions per second with a maximum of 2 MB/second per shard, which means that to handle downstream loads you might need to bump up the number of shards. And as of this writing, there is no auto-scaling feature; you kind of have to do the math and experiment to find the right balance.
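"Do the math" can be sketched as a small helper. This is my back-of-the-envelope version, based on the published per-shard limits (writes up to 1 MB/s and 1,000 records/s, reads up to 2 MB/s); the function name and inputs are illustrative.

```javascript
// Rough shard-count estimate for a Kinesis stream, using the
// per-shard limits: writes capped at 1 MB/s and 1,000 records/s,
// reads capped at 2 MB/s (across 5 read transactions/s).
function estimateShards({ writeMBps, writeRecordsPerSec, readMBps }) {
  const forWriteBytes = Math.ceil(writeMBps / 1);      // 1 MB/s write per shard
  const forWriteRecords = Math.ceil(writeRecordsPerSec / 1000); // 1,000 records/s
  const forReadBytes = Math.ceil(readMBps / 2);        // 2 MB/s read per shard
  return Math.max(1, forWriteBytes, forWriteRecords, forReadBytes);
}

// e.g. 3 MB/s in, 2,500 records/s, one consumer reading it all back out:
console.log(estimateShards({ writeMBps: 3, writeRecordsPerSec: 2500, readMBps: 3 }));
// prints 3
```

Whichever dimension is the bottleneck (write bytes, write record count, or read bytes) dictates the shard count; in practice you'd pad this estimate for bursts.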
Lambdas, Lambdas, Lambdas
I know there is a lot of talk right now about serverless IoT platform architecture so I’ll add my $.02. It’s just like anything else related to technology: if you have a problem, chances are there are already certain tools that exist to solve it. But not every tool solves every job, so pick the right one.
When you get 25 years or so into your passion, you build up quite the bank of experience to help you make these kinds of decisions. For us, Lambdas make sense, and we use them pretty heavily for specific processing logic and data persistence.
It all starts with a primary reader from Kinesis with a fan-out into a number of sub-functions, all of which are written in Node.js.
- Pay attention to synchronous vs. asynchronous invocation
- State is not guaranteed between invocations, so watch your database connection/resource management
- Keep your functions small and purposeful
You can imagine that with any real volume of data there is no one-size-fits-all solution for data in motion and data at rest. To combat that, we use a mixture of solutions:
- Data in motion: Redis. I love Redis and we put a nice-sized cluster to good use!
- Data at Rest:
- MongoDB – I’ve used this before on a smaller scale but can’t say enough about how much I love working with this system. I actually love the query language, the way you can use pipelines to build up aggregated queries. And working with the Mongoose library, a Node.js Object Document Mapper (ODM), has been a delight!
- Elasticsearch – When you’ve got lots of data and you need to search it, there are a few products out there that’ll do the job. I’ve successfully used this one a few times and I’m happy to report I’m using it again.
- MySQL – Reporting and rolled-up data. Most of the reporting work can happen inside of Mongo, but we do shuttle a good bit of data into a relational/denormalized form for ease of querying. In the future I could see us looking into Redshift as sheer size, speed, and management requirements push us to make a change.
- S3 – Logs and logs and logs, oh my!
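The aggregation pipelines I mentioned loving in Mongo look something like this. The collection and field names here are hypothetical, invented for illustration; the stage operators (`$match`, `$group`, `$sort`, `$hour`) are standard MongoDB aggregation operators.

```javascript
// A MongoDB aggregation pipeline built up stage by stage: roll beacon
// sightings up to one row per asset per hour (field names hypothetical).
const sightingsRollup = [
  // Stage 1: restrict to the window we care about.
  { $match: { seenAt: { $gte: new Date("2017-01-01") } } },
  // Stage 2: group by asset and hour, counting sightings.
  { $group: {
      _id: { assetId: "$assetId", hour: { $hour: "$seenAt" } },
      sightings: { $sum: 1 },
      lastSeen: { $max: "$seenAt" },
  } },
  // Stage 3: order the rollup chronologically.
  { $sort: { "_id.hour": 1 } },
];

// With Mongoose, this would run against a model as:
//   Sighting.aggregate(sightingsRollup).exec();
```

Each stage feeds the next, which is what makes pipelines so pleasant to build up incrementally: you can test `$match` alone, then bolt on `$group`, and so on.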
I can’t say enough how much fun it has been to be a part of THINaër.
I plan to contribute future posts as our platform evolves and we begin to engage the developer community, because we have a very robust RESTful API that is completely accessible to partner developers. It covers all of the raw movement, location, peripheral, and device details, and in the near future it will contain all of the asset-related metadata I mentioned before. As the API continues to be refined, I hope to share those details.
If you have a problem that requires speed, size and volume of real-time or near real-time data, chances are the major cloud vendors have similar tools. I make no claims that this is the only approach. But it’s what works for us right now. And in the future, who knows?
And therein lies the fun!
How do you track your high value assets?
When a single piece of medical equipment costs $50,000, knowing where it is and when it’s being used is critical to optimizing equipment utilization.
If you could solve even one problem today, what would it be?
THINaër is helping organizations save money and increase revenue with IoT solutions that can be deployed today and enhanced tomorrow.