Just about every IoT service provides stable long-term storage for the data your things are producing. Why is that?
Take this simple IoT weather station as an example:
Say this station measures the temperature once per second and stores each reading as a single byte of data. It will produce over 3 kilobytes of data every hour. Over a year, that will grow to over 30 megabytes!
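That back-of-the-envelope arithmetic is easy to verify. A minimal sketch (the one-byte-per-second rate comes from the example above):

```python
# Data volume for a station logging one 1-byte temperature reading per second.
SECONDS_PER_HOUR = 60 * 60
SECONDS_PER_YEAR = SECONDS_PER_HOUR * 24 * 365

bytes_per_hour = 1 * SECONDS_PER_HOUR   # 3,600 bytes, a bit over 3 KB
bytes_per_year = 1 * SECONDS_PER_YEAR   # 31,536,000 bytes, roughly 30 MB

print(f"per hour: {bytes_per_hour / 1024:.1f} KB")
print(f"per year: {bytes_per_year / (1024 * 1024):.1f} MB")
```

And that is for a single one-byte sensor value; every extra sensor multiplies the total.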
Sure, 30MB doesn’t sound like much from the perspective of a laptop or workstation, but for small, power-efficient embedded hardware, such as one of Adafruit's ESP32- or SAMD-based Feather devices, which measure storage space in kilobytes, 30MB is a very big deal.
If this weather station is also producing data for humidity, barometric pressure, and wind speed, it’s easy to see that we’re going to need to store this data off-device. Even if you have a high-powered single-board Linux computer with 8GB of storage, you’re one flipped bit away from a corrupted filesystem and all the data lost. And if you imagine having tens, hundreds, or thousands of these devices, all producing weather data from different geographic locations, it would be very convenient to have all of this data automatically collected together in one backed-up place.
Don't forget, your devices are connected to the internet! So there’s no need to try to store all the data on the device itself. Instead, your Things will use services to store the data they produce at the very moment they produce it. The service is then responsible for storing this raw data, typically in a time-stamped database. It will also provide a way to access the data, either in the form of a user interface, an API which other Things or apps can use, or both.
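As a sketch of what "store it the moment you produce it" looks like, here is one way a device might package a single time-stamped reading before sending it to such a service. The device ID, field names, and endpoint URL are all hypothetical, not any particular service's API:

```python
import json
import time

def make_reading(temperature_c: float) -> str:
    """Build a time-stamped JSON payload for one temperature reading."""
    payload = {
        "sensor": "weather-station-01",   # hypothetical device ID
        "temperature_c": temperature_c,
        "timestamp": int(time.time()),    # seconds since the Unix epoch
    }
    return json.dumps(payload)

# On a real device you would POST this to the service's API, e.g. with
# urllib.request on CPython, or an HTTP library on CircuitPython:
#   urllib.request.urlopen("https://example-iot-service/api/readings",
#                          data=make_reading(21.5).encode())
print(make_reading(21.5))
```

The timestamp travels with the reading, so the service can slot it into its time-stamped database even if the upload is delayed or retried.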