At the data restaurant:
Chef: Our freezer is broken and our pots and pans are rusty. We need to refactor our kitchen.
Manager: Bring me a detailed plan on why we need each piece of equipment, what we can do with each, three price estimates for each item from different vendors, a business case for the technical activities required, and an extremely detailed timeline. Oh, and do not stop doing your job while doing all this paperwork.
Chef: ...
Boss: ...
Some time later a customer gets to the restaurant.
Waiter: This VIP wants a burger.
Boss: Go make the burger!
Chef: Our frying pan is rusty and we do not have most of the ingredients. I told you we need to refactor our kitchen. And that I cannot work while doing that mountain of paperwork you wanted!
Boss: Let's do it like this: fix the tech mumbo jumbo just enough to make this VIP's burger. Then we can talk about the rest.
The chef then runs to the grocery store and back, and hurries to make a health-hazard burger with a rusty pan.
Waiter: We got six more clients waiting.
Boss: They are hungry! Stop whatever useless nonsense you were doing and cook their requests!
Chef: Stop cooking the order of the client who got here first?
Boss: The others are urgent!
Chef: This one had said so as well, but fine. What do they want?
Waiter: Two more burgers, a new kind of modern gaseous dessert, two whole chickens and an eleven-seat sofa.
Chef: Why would they even ask for a sofa?!? We are a restaurant!
Boss: They don't care about your Linux techno bullshit! They just want their orders!
Chef: Their orders make no sense!
Boss: You know nothing about the client's needs!
Chef: ...
Boss: ...
That is how I feel every time I have to deal with a boss who can't tell a PostgreSQL database from a robots.txt file.
Or every time someone assumes we have a pristine SQL table with every single column imaginable.
Or that a couple hundred terabytes of cold storage data must be scanned entirely in a fraction of a second on a shoestring budget.
Or that years of historical data that was never stored can be retrieved from limbo.
Or when I'm told that refactoring has no ROI.
Fuck data stack cluelessness.
Fuck clients that lack basic logic skills.
The solution for this one isn't nearly as amusing as the journey.
I was working for one of the largest retailers in NA as an architect. Said retailer had over a thousand big box stores and an IT maintenance budget of $200M/year. The kind of place that just reeks of waste and mismanagement at every level.
They had installed a system to distribute training and instructional videos to every store, as well as recorded daily broadcasts to all store employees, as a way of reducing management time spent with employees in the morning. This system had cost a cool 400M USD, not including labor and upgrades, for round 1. Round 2 was another 100M to add a storage buffer to each store, because they'd failed to account for the fact that the stores' internet connections and the outbound pipe from the DC weren't capable of running the public-facing e-commerce and streaming all the video data to every store in realtime. Typical massive enterprise clusterfuck.
Then security gets involved. Each device at the stores had a different address on a private megawan. The stores generally didn't phone home; home phoned them, as an access control measure. Stores calling the DC was verboten. This presented an obvious problem for the video system, because it needed to pull updates.
The brilliant Infosys resources had a bright idea to solve this problem:
- Treat each device IP as an access key for that device (avg. 15 devices per store).
- Verify the request IP, then issue a redirect to ANOTHER IP, unique to that device, that the firewall would ingress only to the video subnet
- Do it all with the F5
A few months later, the networking team comes back and announces that after months of work and tens of person-years they can't implement the solution, because iRules have a size limit and they would need more than 60,000 lines or 15,000 rules to implement it. Sad trombones all around.
Then a wild DBA appears, steps up to the plate and says he can solve the problem with the power of ORACLE! A few months later he comes back with some absolutely batshit solution that stored the individual octets of an IPv4 address and ran multiple nested queries against the same table to emulate subnet masking through some temp-table-spanning voodoo. Time to complete: 2-4 minutes per request. He too eventually gives up the fight, sort of, in that backhanded way DBAs tend to do everything. I wish I had paid more attention to that abortion, because the rationale and its mechanics were just staggeringly Rube Goldberg and should have been documented for posterity.
So I catch wind of this sitting in a CAB meeting. I hear them talking about how there's "no way to solve this problem, it's too complex, we're going to need a lot more databases to handle this." I tune in and gather that all it really needs to do, since the ingress firewall is handling the origin IP checks, is convert the request IP to the video ingress IP, 302, and call it a day.
While they're all grandstanding and pontificating, I fire up Visual Studio and:
- write a method that encodes the incoming request IP into a single uint32
- write an HTTP module that keeps an in-memory dictionary of uint32 (request IP) to string (target IP), converts the request IP and 302s the call, with blackhole support (sketched below, after this list)
- convert all the mappings in the spreadsheet attached to the meetings into a csv, dump to disk
- write a wpf application to allow for easily managing the IP database in the short term
- deploy the solution to one of our stage boxes
- add a TODO to eventually move this to a database
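For the curious, here's roughly the shape of it. The real thing was a .NET HttpModule plus a WPF admin tool; this is just a hypothetical TypeScript/node stand-in, with a made-up file name and port, to show how little logic the "too complex" problem actually needed:

import * as fs from "fs";
import * as http from "http";

// Encode a dotted-quad IPv4 address into a single uint32 key.
function ipToUint32(ip: string): number {
  const octets = ip.split(".").map(Number);
  if (octets.length !== 4 || octets.some(o => !Number.isInteger(o) || o < 0 || o > 255)) {
    return 0; // malformed addresses fall through to the blackhole
  }
  return ((octets[0] << 24) | (octets[1] << 16) | (octets[2] << 8) | octets[3]) >>> 0;
}

// Static file "database": one requestIp,videoIngressIp pair per line (file name is a placeholder).
const table = new Map<number, string>();
for (const line of fs.readFileSync("mappings.csv", "utf8").split("\n")) {
  const [requestIp, videoIp] = line.trim().split(",");
  if (requestIp && videoIp) table.set(ipToUint32(requestIp), videoIp);
}

http.createServer((req, res) => {
  // The ingress firewall has already verified the origin, so the socket address is the lookup key.
  const addr = (req.socket.remoteAddress ?? "").replace(/^::ffff:/, "");
  const target = table.get(ipToUint32(addr));
  if (target) {
    // 302 the device over to its unique video-subnet address.
    res.writeHead(302, { Location: "http://" + target + (req.url ?? "/") });
  } else {
    res.writeHead(404); // blackhole anything unmapped
  }
  res.end();
}).listen(8080);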
All this took about 5 minutes. I interrupt their conversation to ask them to retarget their test to the port I exposed on the stage box. Then watch them stare in stunned silence as the crow grows cold.
According to a friend who still works there, that code is still running in production on a single node to this day. And still running on the same static file database.
#TheValueOfEngineers
Two things before this all:
- I fucking love gitlab so far
- I miss the fuzzy searching from Sublime Text, as VS Code still can't do it properly..
I was fed up with all the shitty overbloated git deployment scripts, sync scripts, automatic backup solutions and hosted git servers out there, so now my own solution is:
- local files are clones of the remote git repos
- local files are synced via dropbox, to easily edit them on any device
- all changes and deleted files are saved up to 1 year on dropbox
- remote has gitlab running and webhooks setup
- the webhooks point to my node scripts, which then rebase the code onto its dedicated dev server (see the sketch after this list)
- daily server backup with 7 days roll
- cold storage backup each 30 days
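For reference, the webhook-to-rebase part is about this much code. A bare-bones TypeScript sketch, with placeholder path, port and token rather than my actual setup: GitLab POSTs the push event with the configured secret in the X-Gitlab-Token header, and the script just rebases the dev server's checkout.

import * as http from "http";
import { execFile } from "child_process";

const SECRET = "change-me";              // token configured on the GitLab webhook (placeholder)
const WORK_TREE = "/srv/dev/my-project"; // checkout on the dedicated dev server (placeholder)

http.createServer((req, res) => {
  // Only accept POSTs carrying the shared secret GitLab sends in X-Gitlab-Token.
  if (req.method !== "POST" || req.headers["x-gitlab-token"] !== SECRET) {
    res.writeHead(403);
    res.end();
    return;
  }
  // Fetch the new commits and rebase the local checkout onto them.
  execFile("git", ["pull", "--rebase"], { cwd: WORK_TREE }, (err, stdout, stderr) => {
    if (err) {
      console.error("deploy failed:", stderr);
      res.writeHead(500);
      res.end("rebase failed");
    } else {
      console.log("deployed:", stdout);
      res.writeHead(200);
      res.end("ok");
    }
  });
}).listen(9000);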
Sounds like overkill, but from my experience, you really can't have enough places that hold a backup, especially cold storage backups.
My goal in general, though, is to have everything on my computer backed up and ready to go ASAP if something happens.
I wanted to just use a virtual machine for development stuff, but that wouldn't be able to run on my laptop, so I need a more general solution where I sync all configs and all projects across (and have some sort of basic list of the tools needed, so I don't need to remember them).
Found, for example, something for VS Code to sync its settings and plugins via any sort of git; will give it a try in the near future too.
Progress since my last post: quite like where the iterations since the cold design got me. Next is the actual rant view, and then I will implement all sorts of auth things like posting comments.
Have also figured out a way to have style and script plugins; haven't tried it yet though, especially with storage rulesets it might not be as smooth as I imagine it to be.
Another screenshot in the comments.
What do you consider the best method for long-term cold data storage? And why?
I was looking at some magnetic tapes, but the degradation of magnetic charge is worrying me. Lose a single bit and you're fucked.
DVDs - not sure, some of my old ones are no longer readable. Not sure why though; they had been kept in a dark, room-temperature place the whole time.
HDDs -- still magnetic.
SSDs? Idk
...?
First job out of school was for a company that did cold storage as its main gig and custom dev as a minor form of additional income. I worked with one of the owners and another guy as a three-man agile team.
Except the owner didn't trust source control, so we didn't use it. There was no organization; instead, the owner would come in every morning and assign something new. Randomly, the owner would come in and pitch a fit that something he had assigned three weeks before (and had immediately pulled us off of the next day, ordering us to DELETE the code for) wasn't done. He treated the other guy on our team as his personal whipping boy. He would sometimes go 2 or 3 days without saying a word to me. No project to work on, nothing. I would sit there all day with nothing to do. I stayed there a year.