Search - "8 bit computer"
-
29-year veteran here. Began programming professionally in 1990, writing BASIC applications for an 8-bit Apple II+ computer. Learned Pascal, C, Clipper, COBOL. Ironic side-story: back then, my university colleagues and I used to make fun of old COBOL programmers. Fortunately, I never had to actually work with the language, but the knowledge allowed me to qualify for a decent job position, back in '92.
For a while, I worked with an IBM mainframe, using the REXX and EXEC2 scripting languages for the VM/SP operating system. Then I began programming for the web, and wrote my first dynamic web applications with cgi-bin shell and Perl scripts. Used the little-known IBM Net.Data scripting language. I finally learned PHP and settled on it for many, many years.
I always wanted to be a programmer. As a kid I dreamed of being like Kevin Flynn, of TRON - create world famous videogames and live upstairs from my own arcade! Later on, at some point, I was disappointed, I questioned my skills, I thought I should do more, I let other people's expectations make me feel bad. Then I finally realized I actually enjoy a quieter, simpler life. And I made peace with it.
I'm now like the old programmers I used to mock 30 years ago. There's so much shit inside my brain. And everything seems so damn complex these days. Frameworks, package managers, transpilers, layers and more layers of code. I try to keep up. And the more I learn, the more it seems I don't know.
Sometimes I feel tired. Yet, I still enjoy creating things and solving problems with programming. I still have fun learning. And after all these years, I learned to be proud of my work, even if it didn't turn out to be as glamorous as in the movies.
-
For those who were wondering, since my last post had some people who seemed interested, here's my progress on my 8-bit computer. I've got the clock on the left hand side, and the 2 8-bit registers on the right hand side. (The power strips in the middle are going to be used as a bus)
-
After 2 hours of wiring/debugging/rewiring, I have my EEPROM programmer halfway done. Currently it can only read locations in memory. Next step: make it programmable.
(For those of you who don't know, EEPROM stands for Electrically Erasable Programmable Read-Only Memory.)
-
1. There are 10 types of people in the world: those who understand binary, and those who don't.
2. How many programmers does it take to change a light bulb?
None. It's a hardware problem.
3. An SEO couple had twins. For the first time they were happy with duplicate content.
4. Why is it that programmers always confuse Halloween with Christmas?
Because 31 OCT = 25 DEC
5. Why do they call it hyper text?
Too much JAVA.
6. Why was the JavaScript developer sad?
Because he didn't Node how to Express himself
7. In order to understand recursion you must first understand recursion.
8. Why do Java developers wear glasses? Because they can't C#
9. What do you call 8 hobbits?
A hobbyte
10. Why did the developer go broke?
Because he used up all his cache
11. Why did the geek add body { padding-top: 1000px; } to his Facebook profile?
He wanted to keep a low profile.
12. An SEO expert walks into a bar, bars, pub, tavern, public house, Irish pub, drinks, beer, alcohol
13. I would tell you a UDP joke, but you might not get it.
14. 8 bytes walk into a bar, the bartender asks "What will it be?"
One of them says, "Make us a double."
15. Two bytes meet. The first byte asks, "Are you ill?"
The second byte replies, "No, just feeling a bit off."
16. These two strings walk into a bar and sit down. The bartender says, "So what'll it be?"
The first string says, "I think I'll have a beer quag fulk boorg jdk^CjfdLk jk3s d#f67howe%^U r89nvy~~owmc63^Dz x.xvcu"
"Please excuse my friend," the second string says, "He isn't null-terminated."
17. "Knock, knock. Who's there?"
very long pause...
"Java."
18. If you put a million monkeys on a million keyboards, one of them will eventually write a Java program. The rest of them will write Perl programs.
19. There's a band called 1023MB. They haven't had any gigs yet.
20. There are only two hard things in computer science: cache invalidation, naming things, and off-by-one errors.
-
My PM: I don't like when you get up and help out other colleagues with their problems on their computer. You're not at their service.
Me: okay, I'll refrain from doing so.
The next day, I arrive 5 minutes before 8, I get myself a coffee, talk with a few colleagues, and:
PM: Hey, can you please come and help me review this email?
Me: ** fuck it, I still have 2 minutes ** Yeah I'm coming
PM: Now please.
Me: ...
Also my PM, 5 minutes later: Hey I don't manage to print my document, can you help me?
Me: ...
10 minutes later, I get a call:
PM: did you call XY about ZX?
Me: Yep, sent you a mail about it 2 minutes ago
PM: Really? I don't see it
Me: I sent it.
PM: Can you send it again?
Me: ...
Later that day:
PM: Hey, what are you up to?
Me: Well, I'm working on our improved websi-
PM: Can you please create a new campaign on Mailchimp? We're all under water here and a bit of cooperation from you would be great
Me: ** huh? ** erm, ok?
PM: Do it now
Me: Yeah yeah, don't worry. ** click ** here, done. Now, where was I...
----- PM on holidays
Other colleague from another department: Hey Phlisg! I have a small problem on our platform, can you help me?
Me: ** writes a script to help her out **
Her: awesome, thank you!!
Her own PM, 5 minutes later: Hey! Thank you very much for your help, it helps us out a real lot, very much appreciated :)
I had lost my smile at work since the beginning of the year, but that little bit of help I gave my colleague just gave it back to me :D
-
I had to open the desktop app to write this because I could never write a rant this long on the app.
This will be a well-informed rebuttal to the "arrays start at 1 in Lua" complaint. If you have ever said or thought that, I guarantee you will learn a lot from this rant and probably enjoy it quite a bit as well.
Just a tiny bit of background information on me: I have a very intimate understanding of Lua and its c API. I have used this language for years and love it dearly.
[START RANT]
"arrays start at 1 in Lua" is factually incorrect because Lua does not have arrays. From their documentation, section 11.1 ("Arrays"), "We implement arrays in Lua simply by indexing tables with integers."
From chapter 2 of the Lua docs, we know there are only 8 types of data in Lua: nil, boolean, number, string, userdata, function, thread, and table
The only unfamiliar thing here might be userdata. "A userdatum offers a raw memory area with no predefined operations in Lua" (section 26.1). Essentially, it's for the API to interact with Lua scripts. The point is, this isn't a fancy term for array.
The misinformation comes from the table type. Let's first explore, at a low level, what an array is. An array, in programming, is a collection of data items all in a line in memory (The OS may not actually put them in a line, but they act as if they are). In most syntaxes, you access an array element similar to:
array[index]
Let's look at C, so we have some solid reference. "array" would be the name of the array, but what it really does is keep track of the starting location of the array in memory. Memory addresses are just numbers: in a very basic sense, the first byte of your RAM is memory location (referred to as an address) 0. "array" would be, for example, address 543745. This is where your data starts. Arrays can only be made up of one type, so that each element in the array is EXACTLY the same size. So, this is how indexing an array works: if you know where your array starts, and you know how large each element is, you can find the element at index 6 by starting at the beginning of the array and adding 6 times the size of the data in that array.
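To make that concrete, here's a minimal C sketch (the array name and values are made up for illustration) showing that indexing is nothing more than that address arithmetic:

#include <stdio.h>

int main(void) {
    int arr[5] = {10, 20, 30, 40, 50};

    /* arr[i] is just address arithmetic: base address plus i times the element size */
    printf("base address          : %p\n", (void *)arr);
    printf("address of arr[3]     : %p\n", (void *)&arr[3]);
    printf("base + 3 * sizeof(int): %p\n", (void *)((char *)arr + 3 * sizeof(int)));

    /* two equivalent ways to reach the same element */
    printf("arr[3] = %d, *(arr + 3) = %d\n", arr[3], *(arr + 3));
    return 0;
}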
Tables are incredibly different. The elements of a table are NOT in a line in memory; they're all over the place depending on when you created them (and a lot of other things). Therefore, an array-style index is useless, because you cannot apply the above formula. In the case of a table, you need to perform a lookup: search through all of the elements in the table to find the right one. In Lua, you can do:
a = {1, 5, 9};
a["hello_world"] = "whatever";
a is a table holding 4 items (the 4th being the key "hello_world" with the value "whatever"), but a[4] is nil: even though there are 4 items in the table, a[4] looks for something keyed by the number 4, not for the 4th element of the table.
This is the difference between indexing and lookups. But you may say,
"Algo! If I do this:
a = {"first", "second", "third"};
print(a[1]);
...then "first" appears in my console!"
Yes, that's correct, in terms of computer science. Lua, because it is a nice language, makes keys in tables optional by automatically assigning integer keys. These start at 1. Why? Let's look at that formula for arrays again:
Given array "arr", size of data type "sz", and index "i", find the desired element ("el"):
el = arr + (sz * i)
This NEEDS to start at 0 and not 1 because otherwise, "sz" would always be added to the start address of the array and the first element would ALWAYS be skipped. But in tables, this is not the case, because tables do not have a defined data type size, and this formula is never used. This is why actual arrays are incredibly performant no matter the size, and the larger a table gets, the slower it is.
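As a small illustration of that point (the at1 helper below is hypothetical, made up for this sketch), emulating 1-based indexing on a real C array costs an extra subtraction on every access, precisely because the raw formula el = arr + (sz * i) would otherwise skip the element sitting at the base address:

#include <stdio.h>

/* Hypothetical helper emulating 1-based indexing on a plain C array:
   it must subtract 1 on every access so the first element isn't skipped. */
int at1(const int *arr, int i) {
    return *(arr + (i - 1));
}

int main(void) {
    int arr[3] = {7, 8, 9};
    printf("%d\n", arr[0]);       /* 0-based: the first element sits at the base address */
    printf("%d\n", at1(arr, 1));  /* 1-based emulation: extra subtraction every time */
    return 0;
}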
That felt good to get off my chest. Yes, Lua could start the auto-key at 0, but that might confuse people into thinking tables are arrays... well, I guess there's no avoiding that either way.
-
Story time! This happened several years ago, back when I didn't have a computer and I was just using the computers at the university. They had 8 iMacs all in a row, and I would sign into one and do my work.
Now these computers have Deep Freeze on them, which is a fancy hard disk driver that treats the entire drive as copy-on-write, so when anything writes to the drive it makes a copy of the block and writes to that instead. That way all your changes are gone when you reboot. It's a real nifty idea, but it's annoying that you have to reset all your settings the way you like them.
So as part of my setup routine I signed into iCloud. This automatically synced my browser history and my email, and various other things I didn't really care about.
One of those things I didn't care about was Find My Mac. I found this out next time I signed into iCloud and saw the university computer on the list. I had never seen these computers on the list before since normally the computer reboots and forgets everything when you log out. What I think happened is the sysadmin forgot to check the "reboot on logout" option in Deep Freeze. So I was like "I wonder what would happen if I passcode locked the computer?" I clicked the passcode lock option and entered 5555, and it seemed to work.
The next day I come in and the particular computer I locked was gone. I thought "oh God what have I done". So I inquired with the sysadmin (who I really hope is not reading this) and he said "oh, someone got into the Find my Mac thing and locked it down. We were trying different codes, since if we couldn't unlock it we'd have to send it to Apple and provide proof of purchase and that could take weeks. We had tried all the obvious ones like 1234 and that wasn't working so I was about to give up, but then I tried 5555 and it rebooted! So yeah, it'll be back soon, and I decided to try installing OS X 10.11 on it because we'll all need to upgrade sooner or later eventually and it's best to have tested a bit first."
So in the end I somehow made it out with my skin still on, and also with El Capitan on one of the computers, which was the only one I used after that. Not so bad! Oh and if you've managed to read all the way through you deserve a cookie 🍪😄
-
Indian web dev companies suck (for developers)
When I finished a 3-year grad program in computer applications here in my country (India), I thought life was gonna be fun working as a developer. Oh boy, I was so wrong.
I started out working for a small service-based IT company, followed by 2 more. I realized really quickly that they're nothing short of a scam. If your company's only agenda is to somehow survive in the market, while showing no signs of growth in 8 fucking years, then I'm sorry, you're working for scamsters.
Now I'm not saying that all of them are alike. But most of them sorta are.
They don't give a shit about quality, not one bit. Quality means no money in the short run. And they haven't been able to develop any strategy to deal with that. Hence, no growth.
They promise 100 things on their website but only provide shitty services in 10.
There is no pair programming, no code review, no code quality check, no architect, no database designer. They won't give you extra time to write test cases. They use git as a storage device.
They don't put their developers (especially the ones who are learning) under any sort of managed development framework to ensure smooth work.
At the end of the day, their main objective is to somehow NOT deliver a project but finish a milestone and make money out of it.
After cashing out for a milestone, they want you to put your current project on hold and start working on a new project until you have like 10-15 projects in the pipeline and you're severely overwhelmed and you just wanna fucking QUIT.
They would say YES to literally every fucking thing, only to disappoint the client later.
I can't believe someone in the US or UK thought it'd be a good idea to approach these companies for their brand new app ideas. They're so fucked.
They rarely finish any project.
I'm sorry if I hurt your feelings. I had to get it out of my system.
-
Writing an emulator for an 8-bit computer with 8-bit memory addressing in C. Or maybe writing a web server in C... Both were really fun and I learned a lot. (But I love C, so there's that)
-
I've been lurking on devrant a while now, I figure it's time to add my first rant.
Little background and setting a frame of reference for the rant: I'm currently a software engineer in the bioinformatics field. I have a computer science background whereas a vast majority of those around me, especially other devs, are people with little to no formal computer background - mostly biology in some form or another. Now, this said, a lot of the other devs are excellent developers, but some are as bad as you could imagine.
I started at a new company in April. About a month after joining a dev who worked there left, and I inherited the pipeline he maintained. Primarily 3 perl scripts (yes, perl, welcome to bioinformatics, especially when it comes to legacy code like is seen in this pipeline) that mostly copied and generated some files and reports in different places. No biggie, until I really dove in.
This dev, who I barely feel deserves to be called one, is a biology major turned computer developer. He was hired at this company and learned to program on the job. That being said, I give him a bit of a pass, as I'm sure he did not have an adequate support structure to teach him any better, but still, some of this is BS.
One final note: not all of the code, especially a lot of the stupid logic, in this pipeline was developed by this other dev. A lot of it he adopted himself. However, he did nothing about it either, so I put fault on him.
Now, let's start.
1. perl - yay bioinformatics
2. Redundant code. Like, you literally copied 200+ lines of code into a function to change 3 lines in that code for a different condition, and added if(condition) {function();} else {existing code;}?? Seriously??
3. Whitesmiths indentation style.. why? Just, why? Fuck off with that. Where did you learn that and why do you insist on using it??
4. Mixing of whitesmiths and more common K&R indentation.
5. Fucked indentation. Code either not indented at all or even indented THE WRONG WAY
6. 10+ indentation levels. This isn't "terrible" on its own, but imagine it combined with the last 3 points. Cannot follow the code at freaking all.
7. Stupid logic. Like, for example, check if a string has a comma in it. If it does, split the string on the comma and push everything to an array. If not, just push the string to the array.... You, you know you can just split the string on the comma and push it, right?? If there is no comma it will be an array containing the original string.. Why the fuck did you think you needed to add a condition for that??
8. Functions that are called to set values in global variables, arrays, and hashes.. function has like 5 lines in it and is called in 2 locations. Just keep that code in place!
9. 50+ global variables/hashes/arrays in one of the scripts with no clear way to tell how/when values are set nor what they are used for.
10. Non-descriptive names for everything
11. Next to no comments in the code. What comments there are are barely useful.
12. No documentation
There's more, but this is all I can think to identify right now. All together these issues have made this pipeline the pinnacle of all the garbage that I've had to work on.
Attaching some screenshots of just a tiny fraction of the code to show some of the crap I'm talking about.
-
LONG RANT ALERT, no TL;DR
* Writes an email to a colleague about why I can't create a page on our CMS without at least an H1 title. She wants me to put up an image with text on it (like a flyer); for multiple reasons, I say I need a textless image. *
30 minutes later:
* Casually plans a frontend optimization project, by looking at files on the CMS, in order to make further development easier and less time-consuming *
*** EMAIL NOTIFICATION ***
* clicks *
"Hello, this is [Graphic designer] from the company who created the image with text on it. I do not understand why you can't put display:none on your <h1> tag. Also, being a web company, we are used to making themes and my solution of display:none will work. It's pityful to work on a design only to have it stripped out from most of its concept. If you can't do that, do tell me what resolution you need."
My first reaction:
"Dear [Graphic designer], I am managing our corporate identity, our backend and frontend codebase, I am a graphic designer myself, and am also SEO-aware. For at least 8 reasons (redacted, 'cuse too long), I will need an image without text. As told to my colleagues, I need a 72/96 DPI 16:9 ratio image, 1920x1080 is a good start but may be bigger. Also, looking at the image, it'll have to be in JPG, at 100% quality, exported for the web. Our database software will optimize the image by itself."
Reasons are about SEO issues, responsiveness issues, CMS tools issues, backend and frontend issues.
Instead, I sent the following email: "We can't. Image please."
I mean seriously. A bit of clarity for you:
In my company, nobody has the slightest idea what I do. They don't understand how a computer works (we all know it works by magic, right?). So of course, since the less people know, the more they think they know better than the one who actually does, my colleague thought our CMS was like a Word document, and began telling me how I should display her bible-length, text-infected image, by using some inline CSS styling: display:none.
I tell her "nope, because of my 8 reasons". She transmits that to the agency who's done the visual, now I have this [Graphic designer] not understanding that there are other CMSs than Wordpress on the web, and she tells me, me being one of the most aware on this CMS we have, how I should optimize my site?
Fucking shit, she connects on our CMS for 1 second and she'll get cancer since it's so bad. I'm in the process of planning a whole new rewrite so the website is well designed (currently I am modifying a base theme made by an incompetent designer). I know the system by heart and I know what you can, or can't do.
Now I just received an answer: "so it's only a pure technical problem". NO, OUR WEBSITE WAS CODED BY A CHIMPANZEE WHO THOUGHT WEB DEV WAS AS EASY AS WRITING "HELLO WORLD" ON A SHITTY CMS THAT FORCES DEV USERS TO USE A FUCKING CUM-WHITE-THEMED EDITOR TO EDIT THE WHOLE SITE!!!
I can't just sneeze and "oh look, it's working!"
-
One of my biggest annoyances: people say 8 bit to describe anything remotely pixellated. These morons could look at a black and white computer image and say 8 bit
-
not universal, but works for me:
1. start listening to long video/podcast/talkshow i'm interested in
2. (optional) think about all the physical things i should do, such as cleaning the house, running errands, etc. conclude "nah, i'd rather stay at the computer".
3. open the project i'm working on, thinking "while i listen, i might as well muck about with this for a bit". the key is for the thought to be duration-indeterminate and non-commital, so it feels like an idea for a voluntary idle activity.
4. start mucking around with the project, starting with the simplest smallest tasks, to slowly shift my focus away from what i'm listening to, so it gradually becomes the background thing as the work gets into foreground of my concentration without me even noticing. this also naturally shifts me towards the more important and complicated tasks in the project
5. naturally lose track of time, realizing i've been working for 2 to 3 hours without break only after what i'm listening to ends (sometimes not even then)
6. at that point, take a break, stretch my legs, get some food, watch some 20-30 minute thing with full attention.
7. find a new long-form mostly audio thing to listen to, and go to step 4. repeat.
8. i found i can work like this 8 to sometimes 20 hours straight in a nice atmosphere, without feeling like i spent the time working, with all the mental exhaustion it brings; instead it feels like "i was listening to interesting/entertaining things and mucking around with some stuff on the side", with all the feeling of "i've been idling the whole time" except the work is actually done, or at least i made some progress. it feels almost like procrastinating, except without the guilt, because i can see i've done a lot through that time. kind of a good compromise between total procrastination and working your ass off into complete anxiety/depression
-
Best way to learn a subject/skill is to build projects. How about building your own computer from scratch to learn how computers work?
https://eater.net/8bit/ this tutorial teaches you to build an 8-bit computer from scratch 😍 -
Age 8 - Gets first computer and struggles with dial-up Internet and my parents yelling at me that they needed to use the phone
Ages 12 to 18 - Gets first laptop, starts messing around with and getting interested in websites, gets involved with SMF, an open source message board system written in PHP, and starts helping people out, eventually getting paid work for setting up websites etc., which led on to learning HTML/CSS and picking up bits and pieces of PHP (and also Photoshop/Illustrator etc.)
Age 18 - Goes to college to study Multimedia, refreshes knowledge of HTML/CSS, learns a bit of Actionscript and some PHP
Age 20 - Finishes Multimedia degree, ends up working as an IT consultant for a small business, which leads me to pick up a bit of bash scripting and a bit more PHP. Leaves this after 3 months and decides to do a small Software Dev course. Get my first taste of Java and Visual Basic there
21 - Enter into a Software Dev degree. Dive deep into Java and a small bit of Javascript.
Age 23 - After the 2nd year of college, gets taken on for an internship with a large multinational, where I learn and get hands-on experience with Angular, JS, CoffeeScript and C#
Present Day - Currently coming up to the end of my degree and can switch between Java, C#, Python, CoffeeScript/JavaScript (front-end or Node), C and Golang. C and Python were introductions from college modules which I kept playing with in my spare time; Golang I just heard of and decided to write a few things in, because why not. I've picked up various frameworks (Spring, Echo, Express etc.) at some point. I basically learn by doing: if something interests me and I enjoy it, I seem to pick it up quickly by diving in and trying to use it.
Early 1970s, when I was around 8 years old. I read about Artificial Intelligence and it blew me away. I knew nothing about computers, other than I wanted to program them.
I still have old computer magazines, starting from around 1978 not long after the microcomputer revolution started.
My first computer had 2K RAM. That's 2048 bytes. I expanded the memory 1K at a time, and it took 2 chips - they were 4 bits by 1024 so you needed 2 chips to have 8 bit wide memory.
2114 static ram, 300ns.
I think they still make them!
PC Trivia-
1. What does a baby computer call his father?
Data
2. What do you call a computer superhero?
A Screen Saver
3. Why did the computer cross the road?
To get a byte to eat
4. Why did the computer get glasses?
To improve its websight
5. Why did the computer sneeze?
It had a virus
6. Where do computers go to dance?
The disk-o
7. Why did the computer squeak?
Because someone stepped on its mouse
8. What happened when the computer geeks met?
It was love at first site
9. What is an alien’s favorite place on a computer?
The space bar
10. What’s the best way to learn about computers?
Bit by bit
-
Top 12 C# Programming Tips & Tricks
Programming can be described as the process that leads a computing problem from its original formulation to an executable computer program. This process involves activities such as developing understanding, analysis, generating algorithms, verifying the essentials of those algorithms (including their accuracy and resource utilization), and coding the algorithms in the chosen programming language. The source code can be written in one or more programming languages. The purpose of programming is to find a series of instructions that can automate solving a specific problem or performing a particular task. Programming requires competence in various subjects, including formal logic, understanding of the application, and specialized algorithms.
1. Write Unit Tests for Non-Public Methods
Many developers do not write unit tests for non-public (internal) methods, because those methods are invisible to the test project. C# lets you expose assembly internals to another assembly: the trick is to add [assembly: InternalsVisibleTo("MyTestAssembly")] to the AssemblyInfo.cs file, which makes the internals visible to the test assembly.
2. Tuples
Many developers build a POCO class just to return multiple values from a method. Tuples, introduced in .NET Framework 4.0, can be used instead.
3. Do not bother with Temporary Collections, Use Yield instead
When developers want to pick items out of a collection, they often create a temporary list that holds the selected items before returning them.
To avoid the temporary collection, developers can use yield, which hands out results one at a time as the result set is enumerated.
Developers also have the option of using LINQ.
4. Making a retirement announcement
Developers who own redistributable components and plan to deprecate a method in the near future can decorate it with the Obsolete attribute to communicate this to clients:
[Obsolete("This method will be deprecated soon. You could use XYZ alternatively.")]
Upon compilation, a client gets a warning with that message. To fail a client build that still uses the deprecated method, pass the additional Boolean parameter as true:
[Obsolete("This method is deprecated. You could use XYZ alternatively.", true)]
5. Deferred Execution While Writing LINQ Queries
When a LINQ query is written in .NET, it is only executed when the LINQ result is actually accessed; this behaviour is known as deferred execution. Developers should understand that the query gets executed again on every enumeration of the result set. To prevent repeated execution, convert the LINQ result to a List after executing it, for example before passing it to a method such as:
public void MyComponentLegacyMethod(List<int> masterCollection)
6. Explicit keyword conversions for business entities
Use the explicit keyword to define the conversion of one business entity to another. The conversion operator is invoked whenever the cast is applied in code.
7. Preserving the Exact Stack Trace
In the catch block of a C# program, if the caught exception object is rethrown (rather than using a bare throw;) and the fault actually occurred in the method ConnectDatabase, the rethrown exception's stack trace will only indicate that the fault happened in the method RunDataOperation.
8. Enum Flags Attribute
Decorating an enum with the [Flags] attribute in C# lets it be treated as a bit field, which allows developers to combine enum values.
For an enum variable whose combined numeric value is 14, printing it will output "BlackMamba, CottonMouth, Wiper". When the [Flags] attribute is removed, the output is simply 14.
9. Implementing the Base Type for a Generic Type
When developers want to constrain the generic type provided to a generic class so that it must inherit from a particular interface, they can use a generic type constraint (a where clause) on the class.
10. Using Property as IEnumerable doesn't make it Read-only
When an IEnumerable property is exposed from a class, calling code can still cast it back to the underlying collection and modify the list.
To avoid this, expose it via AsReadOnly rather than AsEnumerable.
11. Data Type Conversion
More often than not, developers have to convert data types for different reasons, for example converting a decimal variable that already holds a value to an int.
Source: https://freelancer.com/community/...
-
I got enrolled in an 'extracurricular activity' in the second grade of my elementary school. We were playing some games at first, but later the teacher started to show us programming and explained the matter very well considering we were all 8-year-olds. I got interested, and while others would play games I was coding and solving the assignments the teacher gave us.
My family thought that the computer would make me stupid, thinking it was made just for playing games. They promised to get me a computer if I had the highest grades in school. I did, not all of them, but I tried really hard to be the best; despite that, I waited for years, while still being close to acing every subject in the meantime.
I got my first computer when I was 16.
Since that day I was constantly reminded that I was wasting my life away sitting at that stupid box.
Later, when I got a job that was well paid, they acknowledged that they had been wrong to do that for the majority of my life.
My parents are unable to explain what I do at my job, as they were never interested in what I really do. "Something with computers" is the most common answer you can hear from them.
My parents are non-technical people and they still don't understand how that box works, and God forbid they buy something online. My father even refuses to use a smartphone.
They also thought that I was no college material, despite my always being in the top 5 students of the year (not the class, but the whole year).
They had other plans for me, but I was aware of that and didn't give a f00ck about what they wanted to do with my life. I knew what I wanted, and it was the exact opposite of what my parents would have liked.
I was not the child they wanted, but I was a good son: I even helped them out and worked student jobs to pay some bills and support them financially. Still, they struggled so hard to find some flaw in my character and decisions just to make their point, but more often than not they failed miserably and just proved how wrong they were and how they don't think anything through.
The only one who really supported me was my elder sister, as she knew I was doing the right thing! She also did things her own way, and I am proud of her, as both of us were dealing with 2 tough customers.
Long rant, but I wanted to add one more thing: I was never into sports, but I trained tae kwon do, was really into it, and was decent at it among my peers. When I was going to a national competition, all I got from my parents on my way out of the house was: "Why are you even going there when you will immediately lose? Is it just to travel a bit?"
TL;DR: my family supported me less in my life than the worst phone call you've ever had with IT support at your worst ISP!