Comments
seriously, should I just not do anything at all anymore?
wtf is wrong with all you stupid fucking people?
@AvatarOfKaine Half true.
Multithreading is possible - e.g.
https://sqlite.org/threadsafe.html/
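A rough Python sketch of what the link describes in practice: several threads writing to one SQLite file, each through its own connection (file and table names are made up). WAL mode lets readers run alongside a writer; the writes themselves are still serialized by SQLite.

import sqlite3
import threading

DB = "demo.db"  # hypothetical database file

def worker(n):
    con = sqlite3.connect(DB, timeout=30)      # one connection per thread
    con.execute("PRAGMA journal_mode=WAL")     # readers don't block the writer
    with con:                                  # atomic transaction
        con.execute("INSERT INTO events(thread, n) VALUES (?, ?)",
                    (threading.current_thread().name, n))
    con.close()

con = sqlite3.connect(DB)
con.execute("CREATE TABLE IF NOT EXISTS events(thread TEXT, n INTEGER)")
con.commit()
con.close()

threads = [threading.Thread(target=worker, args=(i,)) for i in range(4)]
for t in threads: t.start()
for t in threads: t.join()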
@fruitfcker The Achilles heel of SQLite is its weak typing.
https://sqlite.org/datatype3.html/
Other than that - SQLite is pretty much one of the most conformant, feature-rich SQL engines available.
Plus its documentation is well made.
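To illustrate the weak-typing point: by default SQLite stores whatever you hand it, regardless of the declared column type; STRICT tables (SQLite 3.37+) opt out of that. A rough sketch:

import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE loose(id INTEGER, amount INTEGER)")
con.execute("INSERT INTO loose VALUES (1, 'not a number')")   # accepted despite the INTEGER declaration
print(con.execute("SELECT amount, typeof(amount) FROM loose").fetchall())
# e.g. [('not a number', 'text')]

# STRICT tables (needs SQLite 3.37 or newer) reject the same insert
con.execute("CREATE TABLE strict_demo(id INTEGER, amount INTEGER) STRICT")
try:
    con.execute("INSERT INTO strict_demo VALUES (1, 'not a number')")
except sqlite3.Error as exc:
    print("rejected:", exc)                                    # constraint violation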
@fruitfcker Presently I’m using the node API, and only for very simple things.
But the point is it’s holding up nicely and seems to perform atomic transactions pretty quickly.
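For reference, an atomic transaction in Python's sqlite3 (the language used elsewhere in this thread - the comment above is about the node API) is just a with block. A rough sketch with made-up file and table names:

import sqlite3

con = sqlite3.connect("app.db")                    # hypothetical database file
con.execute("CREATE TABLE IF NOT EXISTS kv(k TEXT PRIMARY KEY, v TEXT)")
with con:                                          # BEGIN ... COMMIT, or ROLLBACK if anything raises
    con.execute("INSERT OR REPLACE INTO kv VALUES (?, ?)", ("setting", "on"))
    con.execute("INSERT OR REPLACE INTO kv VALUES (?, ?)", ("retries", "3"))
con.close()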
@IntrusionCM “The threading mode can be selected at compile-time” hehe
Betcha anything it’s usually distributed in single-thread mode
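Easy enough to check rather than guess - a rough Python sketch; the exact output depends on the build, but serialized (THREADSAFE=1) is SQLite's compile-time default:

import sqlite3

print("DB-API threadsafety level:", sqlite3.threadsafety)    # 0, 1 or 3 depending on the build
con = sqlite3.connect(":memory:")
for (option,) in con.execute("PRAGMA compile_options"):
    if option.startswith("THREADSAFE"):
        print(option)                                        # e.g. THREADSAFE=1 (serialized), if listed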
@IntrusionCM As for typing: embedded databases aren’t really meant to be used for heavy-hitting things.
I’m not sure how successful it was, but they were trying to embed MySQL and MSSQL once upon a time. Thoughts?
@AvatarOfKaine
Gentoo has thread safety enabled.
I think that SQLite shines when it comes to "compilation" of data.
E.g. the typical write-once, read-heavy pattern where you compile all the data once, ram it into a database and then just need to read it.
Personally, I've used it multiple times for migration / statistics projects.
Excel is a PITA when it comes to aggregation from several different data sources...
Python - gather / collect / aggregate the data
Ram it into an SQLite database
Separate Python script - read and enrich the data
This is especially useful if you need a braindead and secure way to have "milestones", aka versioned database snapshots (rough sketch below).
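A rough sketch of that two-script split, assuming made-up file, table and column names:

# gather.py - collect/aggregate, then ram it into an SQLite snapshot
import sqlite3

def fetch_rows(source):
    # stand-in for the real gathering/aggregation step
    return [(source, "some-key", 42)]

con = sqlite3.connect("snapshot_2024-W01.db")
con.execute("CREATE TABLE IF NOT EXISTS metrics(source TEXT, key TEXT, value INTEGER)")
with con:
    for source in ("export_a", "export_b"):
        con.executemany("INSERT INTO metrics VALUES (?, ?, ?)", fetch_rows(source))
con.close()

# read.py - separate script: read-only access keeps the snapshot untouched
con = sqlite3.connect("file:snapshot_2024-W01.db?mode=ro", uri=True)
for source, key, value in con.execute("SELECT source, key, value FROM metrics"):
    print(source, key, value)        # enrichment / reporting would happen here
con.close()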
As a real-life example... Collecting data from Elasticsearch can be done via monitoring - but when thousands of indexes and dozens of terabytes of data come into play, monitoring is a dangerous path.
Collecting _everything_ will lead to severe performance degradation, as it can easily aggregate hundreds of megabytes of data...
So instead, every week at a specific point in time, gather everything, ram it into SQLite and store the database - and thus have a fixed point-in-time view of all the monitoring data.
It can then be used to feed e.g. InfluxDB, create Excel tables, etc. - it is much more flexible and versatile.
As each SQLite database is a single file, it's dead easy to iterate over and connect to each file to build e.g. a yearly overview or stuff like that (rough sketch below).
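Iterating over the weekly snapshot files for a yearly overview is then roughly this (paths and table layout are the same made-up ones as in the sketch above):

import glob
import sqlite3

yearly = {}
for path in sorted(glob.glob("snapshots/2024-W*.db")):          # one SQLite file per week
    con = sqlite3.connect(f"file:{path}?mode=ro", uri=True)
    for source, total in con.execute("SELECT source, SUM(value) FROM metrics GROUP BY source"):
        yearly[source] = yearly.get(source, 0) + (total or 0)
    con.close()

for source, total in sorted(yearly.items()):
    print(source, total)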
Just to clarify... we still use monitoring, but with very careful limits on what and how much.
Actually surprised SQLite works so well for the purpose I'm putting it to.