Search - "gzip compression"
-
My worst experience was at a job where, after 3 years of contracting, they told me I had to move to a permanent position without any specific offer.
Why is that bad? In my country it means an approximately 40% lower wage.
I came into the job with PHP knowledge when they were looking for Perl on a project one year behind schedule. I learned the language and finished a working demo in 6 weeks.
After that, every project that was ever assigned to me was done within 5-15% of the allocated time. I'm not kidding here. My manager loved me, because I was reliable, fast, and I even 'accidentally' solved other problems. For instance, I developed a simple syslog search tool and benchmarked compression algorithms for reading speed, and the fastest one had 70% better compression than the algorithm used before (switching from gzip to plzip on 1-2 GB files). That solved another problem - the syslog servers did not have enough disk space and there was no money to upgrade them.
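(A sketch of what that kind of benchmark boils down to - the file name is made up, and plzip is assumed to be installed:)
f=syslog-archive.log
gzip -k -9 "$f"                      # -> $f.gz, original kept
plzip -k -9 "$f"                     # -> $f.lz, original kept
ls -l "$f" "$f.gz" "$f.lz"           # compare compressed sizes
time zcat "$f.gz" > /dev/null        # reading speed, gzip
time plzip -dc "$f.lz" > /dev/null   # reading speed, plzip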
The number of projects I touched or developed was over 20.
I also led and developed our team's most successful tool, which every customer was throwing money at, while cutting costs everywhere.
And after three years of that, my manager says there is no more money for contractors and the only possibility is regular employment. Without any specific offer! Just 'we can't do this anymore'.
Which I understand can happen in a corporation, but ffs, after all I've done I expected a warmer attitude. Not 'you may have to leave, since we don't really care'.
I liked the people there, even though the corporate environment was lacking in many respects. I wanted to help our local branch with everything I could, and they gave up on me just like that.
So I started looking elsewhere and found a startup that offered 6 times the money from my previous job and promised to relocate me to the USA. Which is the best thing that happened to me that year and the second best in my whole life!
-
TL;DR: I wrote recursive compression with random algorithms to fuck with a lazy-ass classmate.
One day a classmate I barely knew told me she had a family reunion and couldn't do her programming assignment, which was due the next morning, so she asked me to do it. I said I had to put a price tag on it because I wanted to buy a new RasPi - I didn't really know her, so I didn't feel bad about it. I told her I needed $20 and after some bargaining we settled at $15. I worked on it for about 3 hours, told her it was finished and sent her a demo video as proof. She was happy with the result and said she'd come to my house later that night to get the source code. At night she came and gave me only $8. Of course I got mad; whatever arguments she threw at me, I refused to give her the source code. But since I was too tired for a longer argument, I accepted the 8 bucks and went upstairs to get the source code. Instead of giving her the actual source code, though, I wrote a quick script that compressed the source code folder 50 times recursively with a random compression algorithm each pass - sometimes gzip, sometimes lzma - and gave her the final 50-times-compressed archive. EAT SHIT MOTHERFUCKER
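A sketch of what such a script might look like (paths are made up, and xz with --format=lzma stands in for the lzma passes):
#!/bin/bash
tar -cf payload src/                                        # start from a single archive of the sources
for i in $(seq 1 50); do
    if (( RANDOM % 2 )); then
        gzip -9 payload && mv payload.gz payload               # gzip pass
    else
        xz -9 --format=lzma payload && mv payload.lzma payload # lzma pass
    fi
done
Every pass after the first mostly just wraps the previous layer, so whoever gets the file has to peel off all 50 layers by hand, guessing the format each time.
-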
Today I wanted to enable gzip compression on a customer's site before delivery.
Unfortunately the hosting provider (the most popular one in France) has not enabled this module on its servers. Why, in 2016, is this module not enabled by default?!
-
Why isn't gzip compression on by default on servers? I can't think of a single case where having it off is what the user wants.
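A quick way to check whether a given host actually serves gzip (example.com is just a placeholder):
curl -s -o /dev/null -D - -H 'Accept-Encoding: gzip' https://example.com/ | grep -i '^content-encoding'
# prints "content-encoding: gzip" when it's on; prints nothing when it isn't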
-
Me: The web app downloads a lot of static content while loading the page, which makes it very slow in low-bandwidth locations. Can you ensure compression is enabled when serving static files?
UI Developer: Sure, I'll look into that. Btw, I have a question regarding that.
Me: Yes, pls.
UI Developer: Once the compressed static files are downloaded to the browser, should I write a separate module to decompress them?
Me: (strategic facepalm)
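For the record: any HTTP client that advertises Accept-Encoding - the browser, or curl with --compressed - undoes the compression transparently, so no client-side module is needed. A quick way to see it (the URL is a placeholder):
curl -s -o /dev/null -D - --compressed https://example.com/app.js
# the headers show "Content-Encoding: gzip", yet the body curl hands back
# (like what the page actually sees) is already decompressed
-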
So for the past day I've been obsessed with adding compression to my build pipeline for web dev. I've implemented HTML minification, Babel JS minification and gzip, and made the server specify the content length to prevent chunking and shave off a few unused bytes. Is there anything else anyone can think of to get even more gains? So far my project went from 1.33 MB to 180 kB transferred. It's a huge win, just wondering if I can push it further somehow?
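One low-effort check for further gains is how much brotli would save over gzip on the same assets (a sketch only; assumes the brotli CLI is installed, with bundle.js standing in for a built file):
gzip -k -9 bundle.js       # -> bundle.js.gz, original kept
brotli -q 11 bundle.js     # -> bundle.js.br, original kept by default
ls -l bundle.js bundle.js.gz bundle.js.br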
-
Oh boyyy, I just had to work with Asterisk again. And holy shit it is still the clusterfuck it was many years ago.
We got:
- Inconsistent documentation that is mixed across all versions.
- The config sprinkled over what feels like 20 gazillion files.
- AEL being a half-assed attempt at a "pRoGRamMinG LanGuAgE"
- The fuck you mean with extensions, endpoints and AORs?
- Inconsistent config parameter naming. Some are snake case, some camel case, some are just everything smushed into a single word.
- queue_log determines whether to write a log to a file. queue_log_to_file says to do so independently of whether you have a realtime backend. Whatever the fuck that is.
- Log compression is done by executing a gzip command after a rotation??!!?!!
-
Just switched from gzip to brotli compression and I have to say I am impressed! Google may suck in some regards, but brotli is awesome ☺️
-
Just wanted to free up some space and separate all of my projects.
First idea ... failed!
mksquashfs /home/tracktraps/Development/myproject1 ~/Squash/myproject1.sfs -info -progress -b 1048576 -comp xz -Xdict-size 100%
mkdir /mnt/myproject1
mount ~/Squash/myproject1.sfs /mnt/myproject1
unionfs -o allow_other,nonempty ~/.unionfs/changes/myproject1=rw:/mnt/myproject1/=ro ~/Development/Project1
Too much CPU overhead, too many folders, files can't be deleted, everything gets mixed up ...
Second idea ... failed!
dd if=/dev/zero of=~/Imgs/myproject1.btfs bs=1M count=10240
mkfs.btrfs ~/Imgs/myproject1.btfs
mount -o defaults,noatime,autodefrag,compress,compress-force,inode_cache ~/Imgs/myproject1.btfs ~/Development/Project1
Well ... little overhead, gzip compression, saved a lot of space, but a fixed image size.
Third idea ... yay!
truncate -s 200G ~/Imgs/myproject1.btfs
mkfs.btrfs ~/Imgs/myproject1.btfs
mount -o defaults,noatime,autodefrag,compress,compress-force,inode_cache ~/Imgs/myproject1.btfs ~/Development/Project1
Well ... little overhead, gzip compression, saved a lot of space ... but wait ... why do my btfs files consume more and more space?
Hmm ... time for a little bash and my beloved systemd timers.
for f in $(find . -type f -name "*.btfs")
do
    project=$(basename "$f" .btfs)                                 # e.g. myproject1
    btrfs balance start -v -dusage=100 ~/Development/"$project"    # compact data chunks
    btrfs balance start -v -musage=100 ~/Development/"$project"    # compact metadata chunks
    fstrim ~/Development/"$project"                                # discard unused blocks inside the fs
    fallocate -d -v "$f"                                           # punch holes in the image file itself
done
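The timer half could look roughly like this (unit names, paths and the script location are made up; the loop above would live in the referenced script):
# /etc/systemd/system/compact-btfs.service
[Unit]
Description=Rebalance and hole-punch project btrfs images

[Service]
Type=oneshot
ExecStart=/usr/local/bin/compact-btfs.sh

# /etc/systemd/system/compact-btfs.timer
[Unit]
Description=Run compact-btfs weekly

[Timer]
OnCalendar=weekly
Persistent=true

[Install]
WantedBy=timers.target

# then: systemctl enable --now compact-btfs.timer
-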
Would you consider compression (gzipping static files) a preparation step of the deploy stage, or part of the build stage?
It's somewhat irrelevant, but it's bugging me.
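Wherever it ends up, the step itself is tiny - e.g. as the last thing the build does (dist/ being a placeholder for the output directory):
find dist -type f \( -name '*.js' -o -name '*.css' -o -name '*.html' \) -exec gzip -k -9 {} +
The .gz files then sit next to the originals for the server to pick up (e.g. nginx's gzip_static).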