# Json to MySql
Convert large JSON files into MySQL. Uses `fgets()` to parse the JSON line by line, minimizing memory usage; it works for smaller JSON too. The output SQL files are also chunked to your desired size.
## Example
JSON must be in this format:
@file(test/input/main.json)
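Since the parser relies on `fgets()`, each record has to sit on its own line. Purely for illustration (the included file above is authoritative), here is a plausible layout assuming one JSON object per line:

```json
[
{"name": "apple", "price": 1.50, "in_stock": true},
{"name": "banana", "price": 0.25, "in_stock": false}
]
```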
Executing it:
```php
<?php
@import(Example.Full)
```
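For the shape of a call site, here is a minimal sketch assuming a hypothetical `JsonToMysql` class; only `generate_schema_info()` appears elsewhere in this README, everything else is invented for illustration:

```php
<?php
// A minimal usage sketch, NOT the real API: the class name, constructor,
// and every member except generate_schema_info() are assumptions here.
require __DIR__ . '/vendor/autoload.php';

$jtm = new JsonToMysql('test/input/main.json'); // ASSUMED class & constructor
$jtm->max_file_length = 1024 * 1024;            // ASSUMED property: ~1MB chunks

$info = $jtm->generate_schema_info();  // column stats (see Column Data below)
$jtm->generate_sql('test/output/');    // ASSUMED: writes CREATE + chunked INSERTs
```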
## Install
@template(php/composer_install)
## Versions (branches)
- v1.0: old version with a disappointing api
- v2.0: new version with a much cleaner api
## Features
- Read a small or large JSON source file and output a MySQL CREATE statement & INSERT statements
- Chunk insert statements, limiting max length of output files
- Output column data from all rows of json file
- Convert nested arrays into JSON strings for INSERT
- TRIES to use the correct column type (text, bool, varchar, int, float); see the sketch after this list
- Build table from generated schema
- Very small RAM footprint: reads the input file line by line & writes the INSERT SQL files line by line
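The column-type guessing could, in spirit, look like the following sketch. It is illustrative only and assumes nothing about the library's internals:

```php
<?php
// Illustrative only: a naive version of the per-value checks that the
// "TRIES to use correct column type" feature implies. Not the real code.
function guess_column_type(array $values): string
{
    $maxLen = 0;
    $allInt = $allFloat = $allBool = true;
    foreach ($values as $v) {
        // nested arrays are stored as JSON strings, per the feature list
        $s = is_scalar($v) ? (string)$v : json_encode($v);
        $maxLen   = max($maxLen, strlen($s));
        $allBool  = $allBool  && is_bool($v);
        $allInt   = $allInt  && is_int($v);
        $allFloat = $allFloat && (is_int($v) || is_float($v));
    }
    if ($allBool)  return 'BOOL';
    if ($allInt)   return 'INT';
    if ($allFloat) return 'FLOAT';
    return $maxLen <= 255 ? 'VARCHAR(255)' : 'TEXT';
}

echo guess_column_type([1, 2, 3]);   // INT
echo guess_column_type(['a', 'bb']); // VARCHAR(255)
```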
## Extra Notes
### Warnings
- Set your length to at least `1024*1024` (roughly one megabyte), or you might end up with several thousand files from large datasets
- Setting a length SMALLER than the length of an insert statement... is probably really bad
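To put a number on that first warning, a quick back-of-the-envelope calculation (the 2 GB figure is hypothetical):

```php
<?php
// Hypothetical 2 GB of INSERT output: a 1024-byte chunk limit would yield
// ~2 million files, while a 1024*1024 limit yields a manageable ~2048.
$total = 2 * 1024 ** 3;                 // 2 GB of generated SQL (assumed)
echo (int)ceil($total / 1024), "\n";    // 2097152 files at 1 KB chunks
echo (int)ceil($total / (1024 * 1024)); // 2048 files at ~1 MB chunks
```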
### Column Data (`$jtm->generate_schema_info()`)
- Each `is_` entry counts the number of rows that matched that `is_` check
- `sampleValue` is the longest value found for that column
```php
<?php
@import(Example.ColumnData)
```
## TODO (maybe)
- a function like `create_and_insert_from_json($json_file_path, $pdo)`... to just have a one-liner that does it all (a rough sketch follows this list)
- Write a bash script to execute generation from the CLI
- Write a bash script to compress, upload, & uncompress the sql files
- Write a bash script that will import all the sql files into the database
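A rough sketch of what that one-liner could wrap, with every name except the TODO's own signature invented for illustration:

```php
<?php
// Hypothetical sketch of the proposed one-liner; JsonToMysql and
// generate_sql_statements() are assumptions, not the real API.
function create_and_insert_from_json(string $json_file_path, PDO $pdo): void
{
    $jtm = new JsonToMysql($json_file_path);            // ASSUMED constructor
    $jtm->generate_schema_info();                       // scan rows & infer column types
    foreach ($jtm->generate_sql_statements() as $sql) { // ASSUMED statement generator
        $pdo->exec($sql);                               // CREATE TABLE, then each INSERT
    }
}
```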