It's likely your data file is not in the optimal format for data analysis. That's okay; it may be optimal for other purposes, such as ensuring data validity, writing clean records with flexible per-record fields (JSONL), or taking up less space (CSV).
Sometimes you need the flexibility to record extra fields for some data points and not others, or you receive your data from an API that returns a list of JSON records, one per line. That's where .jsonl comes in.
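To make that concrete, here is a minimal sketch of reading JSONL with Python's standard library. The field names below are made up for illustration and are not WipperSnapper's exact log schema:

```python
import json

# Hypothetical JSONL log contents: one JSON object per line.
# (Field names here are illustrative, not WipperSnapper's exact schema.)
jsonl_text = """\
{"timestamp": 1714000000, "pin": "A0", "value": 0.42}
{"timestamp": 1714000060, "pin": "A0", "value": 0.47, "note": "extra field"}
"""

# Parse each non-empty line as its own JSON object.
records = [json.loads(line) for line in jsonl_text.splitlines() if line.strip()]

for rec in records:
    print(rec)

# The second record carries an extra "note" field the first one lacks --
# that per-record flexibility is exactly what JSONL is good at.
```

Notice there is no shared header: each line stands alone, which is why a line-by-line API response maps onto JSONL so naturally.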
Unfortunately, only a few tools (usually paid) support importing JSONL files directly for data analysis, and WipperSnapper's offline mode writes to JSONL. It's often best to convert the JSON records into a new format purely for data analysis, such as .csv, .xlsx, or some database format; just bear in mind that most formats have their own limitations.
.csv files are great in theory, but as a file format, CSV has disadvantages, mainly that it took shape in the melting pot of industrial insanity otherwise known as undefined behaviour, or more specifically, incompatible non-standardised usage (dating from even before the internet).
These days, we have a bit more of a consensus since most companies want compatibility with other languages and systems.
The UK government has a guide on how it believes the best compatibility can be achieved (mainly: don't use CSV unless appropriate). Take a look if you are interested, or just know that ideally the file should be UTF-8 encoded. At minimum, you can assume each value is separated by the field separator symbol (a comma), each row is separated by a line break (one line per row), and the file can optionally have a header row (recommended). That's about it.
Fortunately, .csv is so common that all spreadsheet and data analysis software can import it, assuming your data is "clean" (compatible and without errors). Often, after import, you still need to set data types for each field.
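As a small illustration of that "set data types after import" step, here is a sketch using Python's built-in csv module. The column names are invented for the example:

```python
import csv
import io

# A tiny CSV with a header row (column names are illustrative).
csv_text = "timestamp,pin,value\n1714000000,A0,0.42\n1714000060,A1,0.47\n"

# DictReader uses the header row to name each field.
rows = list(csv.DictReader(io.StringIO(csv_text)))

# CSV has no type information: every field comes back as a string,
# so we convert the numeric columns ourselves.
for row in rows:
    row["timestamp"] = int(row["timestamp"])
    row["value"] = float(row["value"])

print(rows[0])  # {'timestamp': 1714000000, 'pin': 'A0', 'value': 0.42}
```

This is the same chore a spreadsheet import wizard does when it asks you to pick a type for each column.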
Similarly, .tsv files (TSV = tab-separated values) use a Tab character to separate each field instead of a comma.
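The same csv module handles TSV too; you only need to change the delimiter. A quick sketch (field names again illustrative):

```python
import csv
import io

# The same record as before, but tab-separated instead of comma-separated.
tsv_text = "timestamp\tpin\tvalue\n1714000000\tA0\t0.42\n"

# Passing delimiter="\t" is the only change needed versus plain CSV.
reader = csv.reader(io.StringIO(tsv_text), delimiter="\t")
header, *data = list(reader)

print(header)   # ['timestamp', 'pin', 'value']
print(data[0])  # ['1714000000', 'A0', '0.42']
```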
Export the Logging Files
Let's convert the offline data logger data from its native JSONL format to CSV for maximum compatibility.
First, copy the .log files from the SD card to your computer. It's usually safer and quicker to work with local copies than to repeatedly read and write the files directly on the SD card.
It's good to be organised at this point: have a folder structure for your projects and/or data files, and if necessary, rename the log files to help this goal further. I usually do this with my config and secrets files too.
Using an Online Converter for JSONL to CSV
Using an online service is often quickest, unless you already have a spreadsheet package installed on your computer, in which case use that instead. We'll show both, but start with this simple service, which is free for converting files up to 5 MB, no account needed:
https://konbert.com/convert/jsonl/to/csv
Simply select your .log file and it will be uploaded and interpreted as JSONL, then you'll be shown a preview of the data. If you use the converter via a different page, or for other file formats, you may find it won't accept a .log file because the file extension is unrecognised. Using the JSONL to CSV link directly does work!
Click the Convert button, which you'll find if you scroll down to the output section beneath the table of data.
A Download button appears with the file size indicated; click it to receive your CSV file.
There is also a Studio button where you can explore your data, filter it, and even ask AI to (try to) make a graph.
With our converted file in hand, that's it for this stage! Now we can move on to exploring the data in one of the next guide pages.
For an alternative solution to this online converter, we'll talk later about importing the JSONL data directly into a Jupyter Notebook, and using some Python to manipulate and evaluate the data (code provided for you, it'll be easy).
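If you're curious what that Python approach boils down to, here is a minimal standard-library sketch of a JSONL-to-CSV conversion. It uses the union of all keys as the header so that records with extra fields still fit; the sample field names are hypothetical, not WipperSnapper's exact schema:

```python
import csv
import io
import json

def jsonl_to_csv(jsonl_text):
    """Convert JSONL text to CSV text, using the union of all keys as the header."""
    records = [json.loads(line) for line in jsonl_text.splitlines() if line.strip()]

    # Collect every key seen across all records, preserving first-seen order,
    # since JSONL records are allowed to have differing fields.
    fieldnames = []
    for rec in records:
        for key in rec:
            if key not in fieldnames:
                fieldnames.append(key)

    out = io.StringIO()
    # restval="" leaves a blank cell where a record lacks a field.
    writer = csv.DictWriter(out, fieldnames=fieldnames, restval="")
    writer.writeheader()
    writer.writerows(records)
    return out.getvalue()

# Hypothetical log lines (illustrative field names):
sample = '{"t": 1, "value": 0.5}\n{"t": 2, "value": 0.6, "note": "spike"}\n'
print(jsonl_to_csv(sample))
```

The first record simply gets an empty cell in the "note" column, which is the usual compromise when flattening flexible JSONL into rigid CSV columns.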
There's also a nice hacky offline Python script I came up with when I had a very large data file (36 MB) as a first test; it also adds a column for the component names into the logs. It comes with the following disclaimer:
BETA Territory:
WipperSnapper offline mode is an early experiment, so the data format may change and become more helpful in the future. For now, the priority is minimising space used on the SD card. To add back extra useful information, there is a loosely maintained Python script that creates extra columns for the component names (using the config.json if available) while converting the data to .csv or .xlsx (Excel format).
The basic usage instructions follow, but for more information please see the GitHub repository: https://github.com/tyeth/Adafruit-Wippersnapper-offline-mode-JSONL-data-converter
Ideally, clone the repository or download it as a zip and extract the files. Then install the requirements with the standard Python package manager, pip: pip install -r requirements.txt
Then you can run the script from the folder where you extracted/cloned it. Provide the .log files first, and optionally the config.json and wipper_boot_log.txt to fill in the extra information automatically.
For example, to merge all .log files from any subfolders (recursively) into one .csv, the command would be:
python ./json_to_xlsx.py -r --merged --csv
  [The script defaults to generating Excel files]
For more information, please check the section of the file that lists the optional arguments.
Alternatively, to avoid installing anything, you may wish to try the Jupyter Notebook page of this guide, which shows some online platforms that both convert the data and use Python code for charts.
Page last edited April 02, 2025