Search - "assembly language"
-
"Python is such a hard language. It has so many rules" - Undergraduate Student who sent out mass email to the class
*Professor makes the next assignment in ARM Assembly*
-
"You'll be learning and working with C++ and Assembly."
I could very well be the only student ever to have been excited by that prospect.
-
Anyone looking for something interesting to do???
Step 1) understand how basic circuitry works on a breadboard, nothing too fancy (implement NAND, AND, an adder, a subtractor)
Step 2) learn about microprocessors and how an OS works
Step 3) learn assembly
Step 4) write a basic assembler and understand how loaders and linkers work!
Step 5) write a kernel with very basic features like memory management, process management and some drivers for IO
Step 6) write an emulator for some simple system, e.g. CHIP-8 (see the sketch after this list)
Step 7) read about compiler theory and automata
Step 8) write a basic Python implementation that compiles (not interprets) to native assembly
Step 9) implement a TCP stack
Step 10) learn as much as you can about complexity measurement, data structures and algorithms using C or C++; it's very important (familiarity with pointers and thus computer memory)
Step 11) learn any high-level language of choice like Python or Ruby
Step 12) stop debating over tabs vs spaces, emacs vs vim, angular vs vue, php vs Python, OOP vs procedural vs functional (just know about all of them and when to use each, but don't fucking debate over which one is superior)
Step 13) live happily and be healthy.
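Since step 6 is the gateway for most people, here's roughly what the heart of a CHIP-8 emulator boils down to: a fetch/decode/execute loop. A minimal sketch (these are real CHIP-8 opcodes, but only 3 of the ~35 families are handled):

    #include <cstdint>

    struct Chip8 {
        uint8_t  mem[4096] = {};  // 4 KB of RAM; programs traditionally load at 0x200
        uint8_t  v[16]     = {};  // V0..VF general-purpose registers
        uint16_t pc        = 0x200;

        void step() {
            // fetch: every CHIP-8 opcode is 2 bytes, big-endian
            uint16_t op = (mem[pc] << 8) | mem[pc + 1];
            pc += 2;
            // decode + execute: dispatch on the high nibble
            switch (op & 0xF000) {
                case 0x6000: v[(op >> 8) & 0xF]  = op & 0xFF; break; // 6XNN: VX = NN
                case 0x7000: v[(op >> 8) & 0xF] += op & 0xFF; break; // 7XNN: VX += NN
                case 0x1000: pc = op & 0x0FFF; break;                // 1NNN: jump to NNN
                default: break; // a real emulator handles ~35 opcode families
            }
        }
    };

-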
Fresh out of college?
Entry-level?
Apply Here!
Junior Web Intern: 16k A Year!
Basic Requirements: HTML, CSS, JavaScript, Node, Angular, JQuery, Bootstrap, Backbone, Handlebars, D3, p3, CMS (WordPress, Wixx), PHP, Java (Android), C++(iOS), openframeworks, openGLSL, Cinder, failed at least two startups, 8086 assembly language.
Recommended requirements:
Git version control
Agile development
Must be able to display an example of each requirement.
-
Java: I'm the complete OOP Language
C: I'm used in most of the places
Python: I am the simplest language that can do wonders...
Assembly language: In the end, you all have to come to me. So all of you STFU.
-
Still trying to get good.
The requirements are forever shifting, and so do the applied paradigms.
I think the first layer is learning about each paradigm.
You learn 5-10 languages/technologies, get a feeling for procedural/functional/OOP programming. You mess around with some electronics engineering, write a bit of assembly. You write an ugly GTK program, an Android todo app, check how OpenGL works. You learn about relational models, about graph databases, time series storage and key value caches. You learn about networking and protocols. You void the warranty of all the devices in your house at some point. You develop preferences for languages and systems. For certain periods of time, you even become an insufferable fanboy who claims that all databases should be replaced by MongoDB, or all applications should be written in C# -- no exceptions in your mind are possible, because you found the Perfect Thing. Temporarily.
Eventually, you get to the second layer: Instead of being a champion for a single cause, you start to see patterns of applicability.
You might have grown to prefer serverless microservice architectures driven by pub/sub event buses, but realize that some MVC framework is probably more suitable for a 5-employee company. You realize that development is not just about picking the best language and best architecture -- It's about pros and cons for every situation. You start to value consistency over hard rules. You realize that even respected books about computer science can sometimes contain lies -- or represent solutions which are only applicable to "spherical cows in a vacuum".
Then you get to the third layer: Which is about orchestrating migrations between paradigms without creating a bigger mess.
Your company started with a tiny MVC webshop written in PHP. There are now 300 employees and a few million lines of code, the framework more often gets in the way than it helps, the database is terribly strained. Big rewrite? Gradual refactor? Introduce new languages within the company or stick with what people know? Educate people about paradigms which might be more suitable, but which will feel unfamiliar? What leads to a better product, someone who is experienced with PHP, or someone just learning to use Typescript?
All that theoretical knowledge about superior paradigms won't help you now -- No clean slates! You have to build a skyscraper city to replace a swamp village while keeping the economy running, together with builders who have no clue what concrete even looks like. You might think "I'll throw my superior engineering against this, no harm done if it doesn't stick", but 9 out of 10 times that will just end in a mix of concrete rubble, corpses and mud.
I think I'm somewhere between 2 and 3.
I think I have most of the important knowledge about a wide array of languages, technologies and architectures.
I think I know how to come to a conclusion about what to use in which scenario -- most of the time.
But dealing with a giant legacy mess, transforming things into something better, without creating an ugly amalgamation of old and new systems blended together into an even bigger abomination? Nah, I don't think I'm fully there yet.
-
Coded in C for the first time (due to college assignments)...
Just found out that there are no strings in the C language 😐
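True, and worth spelling out: C only has arrays of char ending in a '\0' byte, plus the <string.h> helpers that walk them. A minimal sketch:

    #include <stdio.h>
    #include <string.h>

    int main(void) {
        char name[16] = "devRant";  // just a char array with a terminating '\0'
        printf("%s has %zu chars\n", name, strlen(name)); // strlen walks to the '\0'
        strcat(name, "!");          // fits here; overflowing the array is on you
        printf("%s\n", name);
        return 0;
    }

-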
Stop teaching assembly first. It may be the underlying language, but your average coder never needs this confusing mess.
-
I just found a game (have not played it yet) that I think everyone here will cream over.
It's an insanely detailed hardware / low-level / make-your-own-computer game.
I watched the trailer and it sets you up by teaching you logic gates and basic circuitry.
Then, it eventually teaches you how to build your own computer using these gates.
Then, you start creating your own assembly language using the computer you made.
Then, you use your computer to solve problems like sending a robot through a maze or just building snake on a display.
Absolutely check it out, it's on sale for $13 USD. I just bought it. Turing Complete on Steam.
-
I learnt programming by making cheats for games and reverse engineering them. It was a fun experience; it wasn't always easy to start with C++ and assembly, but it was definitely worth it. When you come from a low-level language such as C++, looking at highly abstract languages such as JavaScript makes everything feel wrong, especially when it comes to types and how you can just switch types in the middle of the code :D. But it also gives you an understanding of how JavaScript could be implemented, what the engine is doing in the background when you create an object etc..
-
If programming languages were countries, which country would each language represent?
Disclaimer: it's just a joke
Java: USA -- optimistic, powerful, likes to gloss over inconveniences.
C++: UK -- strong and exacting, but not so good at actually finishing things and tends to get overtaken by Java.
Python: The Netherlands. "Hey no problem, let'sh do it guysh!"
Ruby: France. Powerful, stylish and convinced of its own correctness, but somewhat ignored by everyone else.
Assembly language: India. Massive, deep, vitally important but full of problems.
Cobol: Russia. Once very powerful and written with managers in mind; but has ended up losing out.
SQL and PL/SQL: Germany. A solid, reliable workhorse of a language.
Javascript: Italy. Massively influential and loved by everyone, but breaks down easily.
Scala: Hungary. Technically pure and correct, but suffers from an unworkable obsession with grammar that will limit its future success.
C: Norway. Tough and dynamic, but not very exciting.
PHP: Brazil. A lot of beauty springs from it and it flaunts itself a lot, but it's secretly very conservative.
LISP: Iceland. Incredibly clever and well-organised, but icy and remote.
Perl: China. Able to do apparently almost anything, but rather inscrutable.
Swift: Japan. One minute it's nowhere, the next it's everywhere and your mobile phone relies on it.
C#: Switzerland. Beautiful and well thought-out, but expect to pay a lot if you want to get seriously involved.
R: Liechtenstein. Probably really amazing, especially if you're into big numbers, but no-one knows what it actually does.
Awk: North Korea. Stubbornly resists change, and its users appear to be unnaturally fond of it for reasons we can only speculate on.
-
Assembly:
"OMG! What an unfriendly language! It's so obsolete!"
WebAssembly:
"OMG! Assembly is on the web! It's so amazing!"3 -
Back then it was for kids. Today I think it will be torture for them.
PS: my first language was assembly (not judging)
-
Yknow, I want to make an android app that I have had in my mind for about half a year now and I already tried twice, both with Kotlin and with Java, but every time I try it's just pain and suffering and frustration...
No it's not because of the language, I like Java and I like Kotlin too and I'd say I'm at least decent at Kotlin and really good in Java...
No no.. the issue is the fucking Android SDK and the mix-and-match documentation available online!!!
Every fucking time I want to implement some sort of UI element, user action or a background service and start googling how to do it, I get at least 3 different Stack Overflow solutions, all of them saying "that way of doing it is deprecated, instead you should X", and looking up the OFFICIAL FUCKING DOCS just makes me roll up in the corner and cry because of how fucking inconsistent it is and the retarded domain language it uses... fucking transactions for fucking fragments inside fucking activities... because I guess the word "screen"/"view"/"template" or something similarly natural was just too mainstream for the all-knowing alphabet soup that Google is...
And then you start looking up what the fucking difference even is and how to code it up, only to find out there are at least 12 other opinions on how fragments should be used and what should be an activity and what should be a damn fragment...
But that's not all, that's just the base... I get a headache even thinking about how the fucking inflating of templates and the entire R. notation works. You want to open a fucking tiny corner menu with the settings options? WELL THEN YOU FUCKING BETTER REMEMBER TO IMPLEMENT IT THROUGH SOME SORT OF EVENT AND INFLATE THE MENU YOURSELF EVEN THOUGH ITS THE SAME FUCKING THING WITH STATIC STRINGS...
AND WHY THE FUCK DO I NEED LIKE 4 NEW FILES TO IMPLEMENT A FUCKING LISTVIEW...
also talking about ListViews... what was wrong with "ListView"... Why do we need a "RecyclerView"... oh right... because the fucks fucked the fuck up and all the legacy components were designed by a monkey and are next to useless! SO WE NEEDED A NEW NAME FOR THE FIXED VERSION, CANT NAME IT LISTVIEW AGAIN... FUCK YOU...
honestly... if I got a dollar for every "what the fuck android" I said while trying to understand that mess I'd be richer by a few hundred...
oh oh oh, but you know what? You don't like the android SDK? that's fine, you can use fucking React or Flutter or something... yeah.. because instead of torturing myself with the android SDK I want to torture myself with an abstraction of the same SDK and JavaScript as the fucking cherry on top... HAVE YOU FUCKING SEEN THE CODE FLUTTER SHOWS ON THEIR WEBSITE AS THE "Introduction" ?!!!
Look at this piece of shit:
[code in attached image, we could really use a proper Markdown support at least for rants]
THAT'S NOT EVEN THE ENTIRE THING, THAT'S JUST THE *REALLY* UGLY PART...
The fucking nesting... What is it with JS and all the fucking nesting every time?! It looks like shit.... It reads like shit as well...
WHY, in the name OF FUCK, ARE THERE MORE THAN 5 ANDROID FRAMEWORKS, and why do ALL of them... use this FUCKING NOVEL idea of programming using A FUCKING BRACKET WALL
It always looks like:
(code(code[code{code(code{code()})}]));
If I wanted to make a fucking app or a website using fucking Haskell I'd do that.... at this point reading assembly code feels like heaven compared to this retardation... Why is this so popular?! WHAT DO YOU PEOPLE SEE IN IT?! Clearly it's not the aesthetics... it looks like a fucking frog vomit running down an emus leg, fuck that.... I don't even hate classic JavaScript, it's a good enough language and it does what I tell it to... but these ugly fucking frameworks like react, angular and whatever else uses this fucking format can go fuck right off. This is not the way JS is gonna get a better name for itself...
So:
Fuck Google
Fuck the marionette that designed the Android SDK
Fuck the Hellspawn that came up with the "functional-like" way of using JavaScript
Fuck everyone that thinks "JavaScript everywhere" is a good thing
And deeply future-fuck everyone that makes a new framework following any of these standards, sticks a .js at the end of the name and releases his hairball.js of an invention into the fucking world....
It's a mess... fuck everything android related...
-
I am a firmware developer with 4 years experience. C and sometimes assembly is my bread and butter.
Like 2 years ago, I was really interested in making a switch to application development. Got referred by my friend to her startup.
But I was a bit rusty with my data structures, high level languages and interpersonal skills.
The first question was to find the number of occurrences of each word in a paragraph. The language choice was Java. But I was allowed to use C++ since it was the closest relative to Java that I knew.
And I started implementing a binary search tree from scratch and started inserting each tokenised word into it, wrote a traversal algorithm.
The interviewer, luckily, was a patient guy. After I completed my whole mess, he asked if it was possible to do this in a slightly better way, with constant-time access and without traversal.
I said yes, we can with a hash table, but I don't know how to implement one. He replied: I don't expect you to implement the hash table, but to see you use it. I asked him if I was allowed to use the standard library, to which he said of course.
*facepalm*.
Finally I understood his expectation, consulted cppreference.com and used an unordered_map.
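For reference, the whole exercise fits in a few lines once you know the container exists; a sketch of the word-count idea (average O(1) per word instead of walking a hand-rolled tree):

    #include <iostream>
    #include <sstream>
    #include <string>
    #include <unordered_map>

    int main() {
        std::string paragraph = "the quick fox and the lazy dog and the cat";
        std::unordered_map<std::string, int> counts; // hash table: word -> occurrences
        std::istringstream stream(paragraph);
        std::string word;
        while (stream >> word) // tokenise on whitespace
            ++counts[word];    // operator[] default-inserts 0 on first sight
        for (const auto& [w, n] : counts)
            std::cout << w << ": " << n << '\n';
    }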
Later there were some questions on databases, which I tried my best to answer. And I frankly replied that I am not comfortable with JS frameworks as of now. Got rejected.
So the mistakes were: I never asked basic questions like what time complexity he expected or whether I was allowed to use the standard library, I didn't spend some extra time studying the stuff needed for the domain switch, and most importantly I panicked.
-
Why am I so average?
It's just a sad realisation. Nobody cares, but I wanna send this out there, just to write down my thoughts.. I am 18, in the 3rd year of high school (grammar school, so nothing IT related, basically a waste of time) and in IT I'm all self-taught, but I feel like I could be better if I just didn't [something]..
I feel like I wanna learn so many things but when I look at you, it seems like a common problem in the IT sphere so hey, average guy joining the club.
I also feel dumb when programming. I didn't manage to learn C++ in its entirety because to really accomplish something, you've got so many ways to do it, and finding the best one requires deep understanding of the tools the language puts at your disposal; I feel like I'm not capable of this (self-taught; in school/uni that's a different story).. But many (most) of you are. I've tried many coding challenges and when I got one working, I just saw how someone else did it in one line by layering functions that I've never heard of..
Also, we've got this kinda specific national competition here in many fields, including IT, for high schools.. And the winners always do something like "AI-driven life simulation" or "Self-flying drone made from an ATMega from scratch with a 3D simulation in C#" or "Game engine" or whatever shit, and it's always from grammar schools and never IT-related schools.. They are like me. Maybe someone helped them, I don't know, but they are just so far away from me while I'm here struggling to get the basic level of math for any kind of machine learning..
Yeah I've written Neural Network from scratch in C but meh, honestly it's pretty basic stuff .. I'd rather understand derivatives which we're going to learn next year and I'm too lazy to learn it from khan academy because I always learn something else.. Like processing (actually codetrain started teaching tensorflow so that might be the light for me...) Or VHDL (guys you can create your own chip / CPU from scratch and it's not even hard and OMFG it's so fucking cool , full adder done yay) or RPi or commodore 64 assembly or game development with Godot and just meh..
I mean, this sounds exactly like not knowing what to do and doing nothing in the end. That was me like 6-12 months ago. Now I'm managing to pick 2-3 things and focus on them and actually feel the progress.
But I lost track of the original point.. I didn't do anything special; every time I'm programming something, everyone does it better and I feel dumb. I will probably never do anything special. Everyone around says "He's still learning, he's a genius" but they have no idea.
I mean, have you seen one of the newest videos on Google's YouTube channel (I openly hate them, but I will keep that away for now), something like "Sarah's story"? It's about a girl that apparently didn't care about IT but self-learned TensorFlow in high school. I think it may be bullshit (like ALL of their videos) but it's probably just embellished, not a complete lie.
And again, here I am. I know C but I'm incapable of learning to program well, which most of you did and now do for a living. I'm incapable of doing anything cool, just understanding what everybody else did and replicating it. I'm incapable of being clever.
Sorry, just misusing devRant to vent a bit
-
When your uncle is an Assembly Language programmer, developing gym treadmills and shit
And he thinks working with Web Technologies is damn easy..
Hahah seriously?
Oh Wait a Min.. :|
-
Hi.
There are only two types of programming languages:
- Assembly
- All the rest
I'm destroyed, my brain is melted.
Assembly is hate and love at the same time.
-
I just installed Opera Mini on my PSP. That alone isn't very exciting on its own, although I am stoked that my website does in fact render on a device from 2009. With the helpful guidance of a laptop from 2004 that's doing the hotspot duties for this thing.
No, what really got me stoked is that Opera still supports these old platforms, and how small they managed to make it. The .jar file for Opera Mini 4.5 is ~800kB large. There's a .jad file as well but it's negligible in size and seems to be a signature of sorts.
Let that sink in for a moment. This entire web browser is 800kB. Firefox meanwhile consistently consumes 800 MEGABYTES.. in MEMORY. So then, I went to think for a moment, how on earth did they manage to cram an entire functioning web browser in 800kB? Hell, what makes up a web browser anyway?
The answer to that question I got to is as follows. You need an engine to render the web page you receive. You need a UI to make the browser look nice. And finally you need a certificate store to know which TLS certificates to trust. And while probably difficult to make, I think it should be possible to do in 800k. Seriously, think about it. How would you go *make* a web browser? Because I've already done that in the past.
Earlier I heard that you need graphics, audio, wasm, yada yada backends too.. no. Give your head a shake. Graphics are the responsibility of the graphics driver. A web browser shouldn't dabble with those at all. Audio, you connect to PulseAudio (in Linux at least) and you're done. Hell I don't even care about ALSA or OSS here. You just connect to the stuff that does that job for you. And WebAssembly.. God I could rant about that shit all day. How about making it a native application? Not like actual Assembly is used for BIOS and low-level drivers. And that we already have a better language for the more portable stuff called C.
Seriously, think about it. Opera - a reputable browser vendor - managed to do it in 800kB on a 12 year old device. Don't go full wank on your framework shit in the comments. And don't you fucking dare to tell me that there's more to it. They did it, for crying out loud. Now you take a look at your shitpile of JS code and refactor that shit already. Thank you.
-
I confess that I know how to manage memory in assembly language, but I never knew how to use the memory button on my Casio calculator :'v Should I be ashamed?
-
I didn't leave, I just got busy working 60 hour weeks in between studying.
I found a new method called matrix decomposition (not the known method of the same name).
The premise is that you break a semiprime down into its component numbers and magnitudes, let's say 697 for example. It becomes 600, 90, and 7.
Then you break each of those down into their prime factorizations (with exponents).
So you get something like
>>> decon(697)
offset: 3, exp: [[Decimal('2'), Decimal('3')], [Decimal('3'), Decimal('1')], [Decimal('5'), Decimal('2')]]
offset: 2, exp: [[Decimal('2'), Decimal('1')], [Decimal('3'), Decimal('2')], [Decimal('5'), Decimal('1')]]
offset: 1, exp: [[Decimal('7'), Decimal('1')]]
And it turns out that in larger numbers there are distinct patterns that act as maps at each offset (or magnitude) of the product, mapping to the respective magnitudes and digits of the factors.
For example, I can pretty reliably predict from a product where the '8's are in its factors.
Apparently there's a whole host of rules like this.
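For the curious, the decomposition step itself is easy to reproduce. A rough sketch (in C++ rather than the Python used above, with made-up names; the rules engine that hunts for the patterns is the actual hard part and isn't shown):

    #include <cstdint>
    #include <iostream>
    #include <vector>

    // split 697 into its magnitude components: 7, 90, 600
    std::vector<uint64_t> components(uint64_t n) {
        std::vector<uint64_t> out;
        for (uint64_t base = 1; n > 0; base *= 10, n /= 10)
            if (n % 10) out.push_back((n % 10) * base);
        return out;
    }

    // prime factorization with exponents by trial division, e.g. 600 -> 2^3 3^1 5^2
    std::vector<std::pair<uint64_t, int>> factor(uint64_t n) {
        std::vector<std::pair<uint64_t, int>> out;
        for (uint64_t p = 2; p * p <= n; ++p)
            if (n % p == 0) {
                int e = 0;
                while (n % p == 0) { n /= p; ++e; }
                out.push_back({p, e});
            }
        if (n > 1) out.push_back({n, 1});
        return out;
    }

    int main() {
        for (uint64_t c : components(697)) {
            std::cout << c << ":";
            for (auto [p, e] : factor(c))
                std::cout << " " << p << "^" << e;
            std::cout << '\n'; // prints 7: 7^1, then 90: 2^1 3^2 5^1, then 600: 2^3 3^1 5^2
        }
    }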
So what I've done is gone and started writing an interpreter with some pseudo-assembly I defined. This has been ongoing for maybe a month, and I've had very little time to work on it in between at my job (which I'm about to be late for here if I don't start getting ready, lol).
Anyway, long and the short of it, the plan is to generate a large data set of primes and their products, and then write a rules engine to generate sets of my custom assembly language, and then fitness test and validate them, winnowing what doesn't work.
The end product should be a function that lets me map from the digits of a product to all the digits of its factors.
It technically already works, like I've printed out a ton of products and eyeballed patterns to derive custom rules, it's just not the complete set yet. And instead of spending months or years doing that I'm just gonna finish the system to automatically derive them for me. The rules I found so far have tested out successfully every time, and whether or not the engine finds those will be the test case for whether the broader system is viable, but everything looks legit.
I wouldn't have pursued this except that when I realized the production of semiprimes *must* be non-Eulerian (long story), it occurred to me that there must be rich internal representations mapping products to factors that we were simply missing.
I'll go into more details in a later post, maybe not today, because I'm working till close tonight (won't be back till 3 am), but after 4 1/2 years the work is bearing fruit.
Also, it's good to see you all again. I fucking missed you guys.
-
I'm the kind of person that says "Fuck python, worst language, fuck C#, Java, Golang", assembly and C are superior.
But I have learned my lesson; yesterday I learned enough C# to be able to make a Windows app that connects to another app via sockets. I tried first to do it with C++, but my app looked like shit and took me about a whole day to make. Then I tried with C#, got the app working in an hour, and now I'm delighted with C#. I guess I have to be open-minded.
-
Bind learning c++ chapter 2
Did the guy who made this language know how assembly works or did he just guess along the way?
-
Ye, so after studying for an eternity and doing some odd jobs here and there, all I can show for it are the following traits:
* Super knowledgeable in arm/Intel assembly language
* C-Veteran with knowledge of some sick and nasty C-hacks/tricks which would even sour the mood of your grandma
* Acquired disdain for any and all scripting languages (how dare you write something in one line that I need a whole library for!)
* All-in-all low-level programmer type of guy (gimme those juicy registers to write into!)
After completing the mandatory part of my computer science studies, all I did was immerse myself into low-level stuff. Even started to hold lectures and all.
Now I'm at the cusp of being let free into the open market.
The thing is: I'm pretty sure that no company is really interested in my knowledge, as no one really writes assembly anymore.
Sure, embedded programming is still a thing, but even that is becoming increasingly more abstract, with God knows how many layers of software between the hardware and the dev, just to hide all the scary bits underneath.
So, are there people in here who're actually exposed to assembly or any hands-on hardware-programming?
Like, in a "which bit in which register/addr do I need to set" kind of way.
And if so, what would you say someone like me should look out for in a company to match my interests to theirs?
Or is it just a pipe dream, and I'd need to brace myself for a mundane software engineer career where I have to process one ticket at a time?
(Just to give a reference: even the most hardware-inclined companies I found "near" me are developing UIs with HTML5 to be used in some such environment ....)
-
Tried to reply to @Fast-Nop who had replied to someone wondering if C would be a good first language.
IMHO C should have been put to sleep ages ago. A few years ago I downloaded the latest, greatest C standard. For a language billed (by many) as small and simple, it was over 800 pages long. Still, there's a lot that's unspecified, like the order of evaluation of function arguments. The size of int etc. is implementation-dependent. And error handling, let's not go there. The macro assembler throws away all the semantics, leaving behind a cryptic value. It's a complex language due to the innumerable interactions possible.
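The evaluation-order point is easy to demonstrate. In this sketch, a conforming compiler may legally print either "12" or "21":

    #include <stdio.h>

    int show(int x) { printf("%d", x); return x; }

    int add(int a, int b) { return a + b; }

    int main(void) {
        // the order in which the two arguments are evaluated is unspecified,
        // so this may print "12" or "21" depending on compiler and flags
        add(show(1), show(2));
        printf("\n");
        return 0;
    }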
It's been called assembly language for the PDP-11 minicomputer. I recently learned that even the VAX-11 was built from SSI chips like the 4-bit 74181 ALU. The VAX.
Anyway, I had several excellent books on programming style written by Henry Ledgard. He despaired of making C look readable. I commend his books, which are so old that the code is UPPERCASE. A lot of what he wrote had to do with program design, naming things, writing good comments, and the idea that the visual shape of a program assists mental clarity.
-
My journey with IT learning; some of my major learning milestones. The following are the years in which I started learning a given technology or domain.
1993 Birth
1999 #HTML
2001 #PHP + Foxpro
2001 #Haskell language
2002 BASIC
2002 #8088 Assembly
2003 #Linux
2007 Visual #Foxpro
2009 #C Language
2010 #Python
2011 #JAVA for mobile #development
2015 Virtual Machines
2016 Networking
2018 #Blockchain
2019 #Elixir & Phoenix
2019 #DevOps
-
Many people asked me this.
Every programming language is built on another one, and since assembly is the lowest-level language, every language is ultimately built on it. So what is assembly made of?
...
When you buy a vacuum cleaner, they give you instructions on how to use it. When a processor manufacturer creates a processor, they give you instructions on how to use it. Assembly language is nothing but the instruction set the processor manufacturer gave us.
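Concretely, a mnemonic is just a human-readable name for a bit pattern the manufacturer documents. A toy sketch with a few real Intel 8085 opcodes (an assembler is, at its core, this lookup plus operand encoding):

    #include <cstdint>
    #include <iostream>
    #include <map>
    #include <string>

    int main() {
        // straight from the 8085 instruction set: mnemonic -> machine-code byte
        std::map<std::string, uint8_t> opcodes = {
            {"NOP",     0x00},
            {"MOV A,B", 0x78},
            {"ADD B",   0x80},
            {"HLT",     0x76},
        };
        for (const auto& [mnemonic, byte] : opcodes)
            std::cout << mnemonic << " assembles to 0x"
                      << std::hex << int(byte) << '\n';
    }

-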
Is your code green?
I've been thinking a lot about this for the past year. There was recently an article on this on slashdot.
I like optimising things to a reasonable degree and avoiding bloat. What are some signs of code that isn't green?
* Use of technology that says it's fast without real expert review and measurement. Lots of tech out there claims to be fast but actually isn't, or is fast only by saturating resources while being inefficient.
* It uses caching. Many might find that counter-intuitive. In technology it is surprisingly common to see people scale or cache rather than directly fixing the thing that's watt-expensive, which is compounded when the cache has weak coverage.
* It uses scaling. Originally scaling was a last resort. The reason is simple: it introduces excessive complexity. Today it's common to see people scale things rather than make them efficient. You end up needing ten instances when a bit of skill could bring you down to one, which could scale as well but likely won't need to.
* It uses a non-trivial framework. Frameworks are rarely fast. Most will fall in the range of ten to a thousand times slower in terms of CPU usage. Memory bloat may also force the need for more instances. Frameworks written on already slow high level languages may be especially bad.
* Lacks optimisations for obvious bottlenecks.
* It runs slowly.
* It lacks even basic resource usage measurement.
Unfortunately smells are not enough on their own but are a start. Real measurement and expert review is always the only way to get an idea of if your code is reasonably green.
I find it not uncommon to see things require tens to hundreds to thousands of times the resources needed, if not more.
In terms of cycles that can be the difference between needing a single core and a thousand cores.
This is common in the industry but it's not because people didn't write everything in assembly. It's usually leaning toward the extreme opposite.
Optimisations are often easy and don't require writing code in binary. In fact the resulting code is often simpler. Excess complexity and inefficient code tend to go hand in hand. Sometimes a code cleaning service is all you need to enhance your green.
I once rewrote a data parsing library that had to parse a hundred MB and was a performance hotspot, porting it into C from an interpreted language. I measured it and the results were good. It had been optimised as much as possible in the interpreted version, but the C version was still a minimum of 50 times faster.
I recently stumbled upon someone's attempt to do the same and I was able to optimise the interpreted version in five minutes to be twice as fast as the C++ version.
I see opportunity to optimise everywhere in software. A billion kg of CO2 could be saved easily if a few green code shops popped up. It's also often a net win. Faster software, lower costs, lower management burden... I'm thinking of starting a consultancy.
The problem is, after witnessing the likes of Greta Thunberg, if that's what the next generation has in store, then as far as I'm concerned the world can fucking burn and her generation along with it.
-
WTF IS WRONG WITH ASSEMBLY LANGUAGE?!
I was just modifying an existing program for adding a sequence of numbers from the data section and through console input. I studied the code and started modifying it one step at a time. I needed to turn it into a multiplication program. So I started by changing the ADD instructions, replaced the result and buffer registers with bigger ones and thought I was done. WELL GUESS WHAT? SHIT JUST GIVES ME A SEGMENTATION FAULT! NOW I HAVE TO REDO THE WHOLE THING! WHY DOESN'T IT TELL ME WHICH LINE OF THE CODE I FUCKED UP AT?! STUPID NASM ASSEMBLER.
-
Well, I wanna specialize in low-level software as I get older. Everyone is telling me to go out and learn a processor architecture. I'm willing to be patient, so I do what people recommend to me and I download the Intel x86_64 manual. I was excited... UNTIL I REALIZED THE MANUAL WAS 4474 PAGES LONG! Like, how am I supposed to jump into assembly, machine language, and low-level programming with a beginner's task like that? I cannot find ANY resources online to simplify the transition, and college sure ain't gonna teach me anytime soon.
-
Just reading Knuth's The Art of Computer Programming, where he created his own assembly language so the book doesn't rely on a currently-in-fashion language... Meanwhile we teach students to code a GUI in Java Swing, because it was the new shit when the syllabus was created...
-
Hey everyone!
I'm on the hunt for new and exciting languages!
I'll state the ones I already know:
Python, Haskell, C(++), C#, Java, JavaScript, Ruby, Rust, Lua, about every kind of Basic, some branches of Lisp, BrainF**k, assembly, Octo (Chip-8) and GML(basically JavaScript).
I've also learnt some styling languages:
Html, CSS, Markup and Markdown.
Some misc languages too: Regex and a tiny bit of the Wolfram Language.
Also I'm kind of limited to Windows, Linux and Android, as I do not own any Apple hardware except I have access to an old iPad, so are languages like Swift still good?
Thanks!
-
I stumbled across a game called Much Assembly Required where you have to control a robot using 8086 assembler. The site looks under construction but it looks interesting!
You can check it out here: https://muchassemblyrequired.com/ga...
-
Now I have a course called "Microprocessor and Assembly Language" this semester. I'm not understanding much of it from the classes (I don't find our teacher very good at teaching). So a couple of days ago she said we have to submit projects at the end of this semester and it must be something related to hardware, not software (as she thinks Assembly is a language for hardware). Now I have to submit a proposal to her very soon with an idea for the project. I'm killing myself over it but can't find any idea. Can anyone help me regarding this matter?
-
It's going to be a long rant here and probably my first rant! And yes, I am pissed off with a community growing in the dev world.
There are so-called framework experts who are so good that they can spin up a nodejs server with express and mongodb.
So to the people who bash on php, who bash on MySQL for no fucking reason other than they have heard these are not so cool: fuck yourself, incompetent piece of crap!!! I can hear all day from these people about how algorithms and data structures are not important. Fuck you, because if you don't know / understand / want to understand the basics of computing, how the fuck can your brain be trusted with anything serious?? If you can't write down proofs of basic / standard algorithms and still bash on people who do, please fuck yourself with a cactus, because those are the people indirectly responsible for your job, so that you can work on fancy frameworks and cool IDEs.
Instead of whining, dedicate some time to your maturity and knowledge, because that's what we devs are all about. We like solving problems, right?
I repeat: if you are just starting an IT career in your mid-20s maybe, leave everything if you can. Forget all fucking frameworks and technologies, start with the basics of computing, right at the instruction level using assembly. Then move to a higher language when you know and can reason about what your CPU is actually doing.
If you can't do that and keep on crying and bashing things without proper explanations, fuck yourself with a cactus.
-
I remember one time my freakin' prof in programming taught us how to understand computer language; at that time my worst enemy was ASSEMBLY, and for some reason my teacher didn't know how to code in assembly, like wtf?
On our last grading period he asked us to create a program using MOV and SHIFT, and the deadline was set for the day after he announced it.
I remember my code in that freaking subject
MOV COURSE
SHIFT SCHOOL
HAHAHAHA after that I was scolded big time 😂
-
As a junior dev from a sysadmin and security background, this is a list of software development concepts I never seemed to truly understand but hope to (rated from most intimidating to least):
1) Frontend web development and all the huge world of javascript frameworks and tools. - It's more overwhelming than the political geography of the Holy Roman Empire in the Middle Ages.
2) Machine Learning, Deep Learning and A.I- too much math that fucks with my brain.
3) low-level programming(kernel,drivers) - sounds extremely interesting but the code in assembly/C/C++ looks like Linear A Minoan hieroglyphics.
4) Rx(insert language here) - I never get why it is useful or why someone invented this. Seems interesting though.
5) Code Reflection - sounds like Thelemic magick.
6) Packaging, automation, build tools, devops, CI, Testing - seems too complicated. I just want to run an executable at the client or make a web app that does something. Why all this process?
-
Writing a full interpreter for a pseudo-assembly language.
It's kinda fun, but the thing's a bit of a cunt of a problem because I only did this once before, almost a decade and a half ago.
Getting just the right format and deciding on the syntax is a slog. I just need something that's quick and sufficiently expressive, because I'm not writing the assembly myself; I'm generating the assembly code that runs through the interpreter, so it has to be valid under a lot of conditions.
-
I had a phone interview and he was shocked that I hadn't learned C, C++ or an assembly language in college. Anyone else find that strange?
-
It's been 5 years this month since I started learning programming, getting interested after learning about Linux, wanting to do operating systems and games.
I started with C++, went on to C and assembly language for about 2 years and gave up on it for the most part.
Afterward did Java for two years and hated every second of it! Switched to Python instead (been using it since 2.7.5).
Now I do Haskell and JavaScript, and those languages make everything so much easier I can never see myself ever going back!
-
Is a Bachelor's or Master's degree necessary for a web developer job?
PS: I am currently pursuing a BCA degree (5th semester) and it has so many subjects I don't like or that aren't related to my aim (like microprocessors and assembly language). So... dear seniors, what do you recommend?
-
Okay, so I don't really know what I did, but somehow my Rust executable that implements a parser for an assembly language all of a sudden needs more than a minute to compile and is over 700 MiB in size 🤠
-
Intel 8085 micro-processor, anyone?
In my graduation, one of the semesters had Intel 8085 programming in the curriculum. It's because of that dev-kit I understood what assembly-level language means.
A simple scenario of adding two numbers would result in a half-page-long sequence of commands that literally didn't excuse any mistakes.
It made me understand the semantics or basically what we get taught as "middle level" languages.
We had to memorize the exact pins of the thing and had to draw it from memory. And we had to learn the instruction set it had.
Later we had to learn Intel 8086 but its instruction set was way too complicated and I gave up on it.
I know it sounds geeky but I randomly remembered it today.
-
Whose fucking idea was it to still consider assembly (with C being optional) as the most relevant language in electrical engineering school?
Also, teaching that 74HC chips and op-amp ICs are still the most common thing in today's electronics is really grinding my gears!!! Is it still an argument that your 8 NAND gates are essentially the same price as a low-cost microcontroller?
But one can be modified within seconds, while for the other you potentially need to redesign the entire board.
-
Not drunk, still underage. I can share a wonderful story from when my wisdom teeth were pulled. My mom made me agree to not work on anything, especially code, when I came home. However, I had an awesome CPU architecture design project, and I was ready to make a few example programs in the assembly language. When I woke up the next morning with a clearer mind, I looked at the code again. ^A , ^X. Mom said "I told you so."
-
I wanna make a c+friends language and it'd be dev friendly and will throw lots of errors on compile to show love. Also it'll compile slower with each newline so you can always say "it's compiling" there will be classes but people instead and then instead of new I'll have create. As for loops let's go with a friendly do while loop and dontdo while as normal while or dowith i while to have a friendly for loop. Instead of ifs let's say decide() and instead of else let's have or. Instead of functions I'll have well you need no functions you'll have jumps and tests before jumps just like assembly has. Oh and everything will be a pointer because then it runs nicer. To create a variable you can't use = because that's the equal sign in decide you need to use "var int myint is 69" because why not. Then to print to the console "console.outputstream.out(myint)" instead of threads I'll have please like "please work" where work is a jump target. I hope you'll enjoy this language ^^
-
the one that exists (c#) seems underused compared to where it could (or even should) be used. and the place that uses it the most (enterprise) butchers and mangles its use, just as enterprise tends to do with everything.
the one that i'm designing... the fact that it doesn't exist yet, and that even as i'm zeroing in on syntax and philosophy that i'm very much starting to be proud of, i still don't have a proper idea of how to implement even the most basic parser/interpreter for it, not because it's in any way difficult or unusual, but just because... i've never done that before, so i get into weird circular thought paths that produce weird nonsensical code...
... on top of that, i still only have a very, very fuzzy idea of how will it (sometime in extremely distant future) actually implement the most interesting and core feature - event-based continuous (partial) re-parsing of the source code and the fact that traversing the tokens at the leaf level of the syntax tree should result in valid machine code (or at least assembly) that is the "compiled" program.
i *know* it's possible, i just don't yet know enough to have a contrete idea how exactly to achieve it.
but imagine - a programming language where interactive programming is basically the default way of working, and basically the same as normal programming in it, except the act of parsing is also the (in-memory) compilation at the same time, so it's running directly on the hardware instead of via interpretrer/vm/any of that overhead crap.
also then kinda open-source by definition.
and then to "only" write an OS in that, and voilá! a smalltalk-like environment with non-exotic, c-family syntax and actual native performance!
ahhh... <3
* a man can dream *
-
The C language combines all the power of assembly language with all the ease-of-use of assembly language.
-
It's a confession...
So yesterday we had a practical at our uni... It was on Assembly Language (NASM and TASM)... It's a horrible language to work with... Trust me... I hate it; in fact... We all hate it at the uni... But the thing is... We need to pass the practical in order to sit for the theory, and it is a really hard language.... So most of my friends brought pen drives... And some brought chits... And sadly... All of them got caught... And were marked as fail right away... But the thing is, I also cheated... And I copied successfully... I didn't use any pendrive or removable media... But I used ssh to my cloud server... And since I code in vi, it was pretty easy for me to cheat in the practical... I feel bad that I cheated.... But then I feel proud as well, because I used the tech of this generation to copy, and not some grandpa shit like pendrives...
Yeah... That was it... The codes did rain in the exam..
I know I am a horrible person.. But come on guys.. Who am I kidding... I am proud that I didn't use any cliché methods... And was talented enough to do so without getting caught...
-
Know what really grinds my gears?
People who refer to "ajax" as though it's a separate programming language, instead of what it is, which is an old shitty method in an old shitty library. What I do enjoy is people thinking it's dish soap. That will *never* not be funny to me.
Examples:
1. *generic job description*...5 years experience. Desired skills: HTML, Foundation, PHP, Ajax, Fortran, Assembly, Tagalog, smoke signals.
2. Someone in "marketing": "Do you know Ajax?"
3. Jackass in a coffee shop who uses moustache wax: "I'm an ajax programmer. Yeah I've heard of [any recent band], like twenty years ago. They suck."
Go die, and take ajax with you.
-
wtf is it with CSS?
It's so freaking tedious to deal w/ all the shit of it down to the most minute detail. How did anyone ever have the patience to make it and use it?
It's like assembly language, so no one should be cursed with having to deal directly with it. Fuck, that there are people with brains that can tolerate it, thus making it live on. No offense, if my brain were that way, it would probably be useful for me, but fucking aye.
-
So I'm having trouble keeping up with a game I'm working on and I'm struggling to find a new job, soooooo I decided fuck it and began conceptualizing a fantasy console with a custom backend language slightly modelled on Assembly that you would then create a front end for...
Going to build it using GML and then build a GML front language because fuck it, why not!
-
That feeling when your friends' college life kind of depends on you helping them out with this assignment using a low-level programming language (low-level meaning it was meant to operate at the machinery level) that you were really good at in the first semester. Then you realize that you have forgotten a lot of things, just because the logic and approach are totally different from a high-level programming language, and you forget how a programming language works once you stop using it, and it takes time to dive back in, and you really like being friends with them. Now all you're left with is the fear of letting them down.
-
Can anyone help me with this theory about microprocessors, CPUs and computers in general?
(I used to love programming during school days when it was just basic searching/sorting and OOP. Even in college, when it advanced to language details, compilers and data structures, I was fine. But subjects like COA and microprocessors, which explain the working of the hardware behind the brain that is a computer, are so difficult for me to understand 😭😭😭)
How does a computer work? All I knew was that when a bulb gets connected to a battery via wires, some metal inside it starts glowing and we see light. No magic involved so far.
Then came the von Neumann architecture, which says a computer consists of 4 things: I/O devices, system bus, memory and CPU. I/O and memory interact with the system bus, which is controlled by the CPU. Thus the CPU controls everything, and that's how a computer works.
Wait, what?
Let's take an easy example of a calculator. I pressed 1+2= on the keyboard, it showed me '1+2=' and then '3'. How the hell did that happen?
Then some video told me this: every key on your keyboard is connected to a multiplexer which gives a special "code" to the processor for the key press.
The "control unit" of the CPU commands the RAM to store every character until '=' is pressed (which is a kind of interrupt telling the CPU to start processing). RAM is simply a bunch of storage circuits (which can store some 1s) along with another bunch of circuits which can retrieve that data.
Up till now, the control unit knows that memory has (for eg):
Value 1 stored as 0001 at some address 34A
Value + stored as 11001101 at some address 34B
Value 2 stored as 0010 at some Address 23B
On receiving the code for the '=' press, the "control unit" commands the ALU of the CPU to fetch data from memory, understand it and calculate the result (i.e. the "fetch, decode and execute" cycle).
The ALU fetches the "codes" from memory, which translate to ADD 34A,23B, i.e. add the data stored at addresses 34A and 23B. The ALU retrieves the values present at the given addresses, passes them through its adder circuit and puts the result at some new address 21H.
The control unit then fetches this result from the new address and, via the system buses, sends this new value to the display's memory mapped at some memory port 4044.
The display picks it up and instantly shows it.
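To make the fetch/decode/execute story above less magical, here is a toy version of it in code (a sketch only, with a made-up one-instruction machine, not any real instruction set):

    #include <cstdint>
    #include <iostream>

    enum Op : uint8_t { HALT = 0, ADD = 1 };

    int main() {
        uint8_t ram[256] = {};
        // the stored program: ADD src1 src2 dst, then HALT
        ram[0] = ADD; ram[1] = 0x34; ram[2] = 0x23; ram[3] = 0x21;
        ram[4] = HALT;
        ram[0x34] = 1; // value '1' parked in memory
        ram[0x23] = 2; // value '2' parked in memory

        for (uint8_t pc = 0;;) {
            uint8_t op = ram[pc];   // fetch
            if (op == HALT) break;  // decode
            if (op == ADD) {        // execute: through the "adder circuit"
                ram[ram[pc + 3]] = ram[ram[pc + 1]] + ram[ram[pc + 2]];
                pc += 4;
            }
        }
        std::cout << int(ram[0x21]) << '\n'; // the "display" reads 3
    }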
My problems:
1. Is this all correct? Is this really all that happens?
2. Please expand on this more.
How do the system bus, ALU and CPU work?
What are the registers, accumulators, flip-flops in the memory?
What are machine cycles?
What are instruction cycles, opcodes, instruction codes?
Where does assembly language come in?
How does the CPU manipulate memory?
The data bus, the control bus: what are they?
I have come across so many weird words I don't understand: DMA, interrupts, memory-mapped I/O devices, etc. Somebody please explain.
PS: I'm learning about the fucking 8085 microprocessor in class and I can't even relate it to basic computer architecture. I flunked the COA paper, and now I realise why, coz it's so confusing. :'''(
-
To all those senior programmers out there: do you think learning assembly can benefit you in landing a job? It seems like a useless language to me, other than it might be fun to play around with kernels
-
My answer to their survey -->
What, if anything, do you most _dislike_ about Firebase In-App Messaging?
Come on, have you ever sat a normal dev, completely new to this push notification thing, in front of it and asked him to get a simple app running, like the flutter firebase_messaging plugin example? For sure you have not, oh dear brain-dead moron who found his college degree in a Linux magazine 'Ruby special edition'.
Every fuckin' thing about that Firebase is a loose end. I read all the Medium articles, your utterly soporific documentation that never ends, and I am actually running the flutter plugin example firebase_messaging. Nothing works or is referenced correctly: nothing. You really go through life with blind eyes... you guys, right? Oh, there is a flimsy workaround in the 100th post under GitHub issue number 10 thousand... let's close the crash report. If I did not have to change 50 meaningless lines in gradle-what-not files to make your brick-of-puke work, then I did not change a single one.
I dream of you, looking at all those nonsense config files, with cross side eyes and some small but constant sweat, sweat that stinks piss btw, leaving your eyes because you see the end, the absolute total fuckup coming. The day where all that thick stinky shit will become beyond salvation; blurred by infinite uncontrolled and skewed complexity; your creation, your pathetic brain exposed for us all.
For sure I am not the first one to complain... your whole thing, from the first to last quark that constitute it, is irrelevant; a never ending pile of non sense. Someone with all the world contained sabotage determination would not have done lower. Thank you for making me loose hours down deep your shit show. So appreciated.
The setup is: servers, your crap-as-a-service and some mobile devices. For Christ sake, sending 100 bytes as a little [ beep beep + 'hello kitty' ] is not fucking rocket science. Yet you fuckin push it to be a grinding task ... for eternity!!!
You know what, you should invent and require another, new, useless key-value called 'Registration API Key Plugin ID Service' that we have to generate and sync on two machines, everyday, using something obscure shit like a 'Gradle terminal'. Maybe also you could deprecate another key, rename another one to make things worst and I propose to choose a new hash function that we have to compile ourselves. A good candidate would be a C buggy source code from some random Github hacker... who has injected some platform dependent SIMD code (he works on PowerPC and have not test on x64); you know, the guy you admire because he is so much more lowlife that you and has all the Pokemon on his desk. Well that guy just finished a really really rapid hash function... over GPU in a server less fashion... we have an API for it. Every new user will gain 3ms for every new key. WOW, Imagine the gain over millions of users!!! Push that in the official pipe fucktard!.. What are you waiting for? Wait, no, change the whole service name and infrastructure. Move everything to CLSG (cloud lambda service ... by Google); that is it, brilliant!
And Oh, yeah, to secure the whole void, bury the doc for the new hash under 3000 words, lost between v2, v1 and some other deprecated doc that also have 3000 and are still first result on Google. Finally I think about it, let go the doc, fuck it... a tutorial, for 'weak ass' right.
One last thing: rewrite all your tech in the latest new in-house language, split everything into 'femto services' => (one assembly operation per OS process) and finally cram all those into containers... Agile, for sure it has to be Agile. Users will really appreciate the improvements to your mandatory service.
-
The rear ducking continues. We've built a reliable translator in the dumbest fucking way possible, it's just lovely. I simply reused the structure for feeding data to the VM assembler, an array of arrays, where there's one array of (ins [args]) per node in the parse tree.
It's nice because nodes can be solved out of order without affecting the actual sequence in which the instructions are output. And if one statement (node) equals multiple instructions, you just push multiple entries to the corresponding array, or push nothing if you need to output nothing. Easy as goblin pie.
This is enough to convert an input language to the assembly-like intermediate representation we use for the virtual machine. So then there's doing it backwards: walk the same array of arrays, and map those virtual instructions to a physical architecture. I guess I could do the encoding to native binary myself, it'd certainly be interesting to try, but I'm burnt out already so I'll just use fasm for now.
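The array-of-arrays idea is simple to picture. A rough sketch in C++ with made-up names (one instruction list per parse-tree node; nodes can be filled in any order, and the final walk fixes the output sequence):

    #include <iostream>
    #include <string>
    #include <vector>

    struct Ins {
        std::string op;                // virtual opcode, e.g. "push"
        std::vector<std::string> args; // its operands
    };

    int main() {
        // one slot per parse-tree node, indexed by node position
        std::vector<std::vector<Ins>> nodes(3);
        nodes[2] = { {"add",  {}} };    // solved first...
        nodes[0] = { {"push", {"1"}} }; // ...fill order doesn't matter
        nodes[1] = { {"push", {"2"}} };

        // walking the outer array in node order emits the program
        for (const auto& node : nodes)
            for (const auto& ins : node) {
                std::cout << ins.op;
                for (const auto& a : ins.args) std::cout << ' ' << a;
                std::cout << '\n'; // prints: push 1 / push 2 / add
            }
    }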
Initial test: wrote a test program in my own stupid language, ran the translator, dump output to file, assemble that with fasm, run with r2 -d.
Crashes? No.
Runs fine? Yes and no.
For fuck's sake, I don't have syscalls. Mainly because the VM doesn't have an operating system, lmao. I was testing virtual programs by just freezing state, terminating, then dumping the fucking registers and stack to the console, we have no I/O to speak of. Not even a real 'exit', VM handles that by reading a return value every step like a mentally damaged son of a bitch.
So anyway, I manually paste the linux mambo, you know:
mov rax,60   ; syscall number 60 = sys_exit on x86-64 Linux
mov rdi,0    ; exit status 0
syscall      ; hand control to the kernel
And NOW our program can end execution without crashing.
Okay then, so does the test code work correctly?
** DRUM ROLL **
Yes.
Ladies and gentlemen, mother fucking PESO is now a compiled language, and going forward I will be expectantly receiving your marriage proposals for reviewing. Oh, but not so fast, we still need a frontend...
Well, we'll handle that in the next few days. I'm just glad to be *nearly* finished with this fucking compiler, I want nothing to do with anything else ever, but we know that's not going to happen, so Lord please end my pain.
No sponsor as this rant has been paid for by tax evasion.
-
So far I've been pretty lucky... except for the code some of my professors at uni used in their assignments. A couple of them had this horrid habit of giving you a horribly-written, out-of-date (we're talking these chuckleheads used the same code for years on end and wondered why it didn't work on new versions of Java), messy source file with "fill in the blanks" sections, like it was some kind of Java Mad Libs book. One of them had an entire jar archive of data structures we were required to use that he'd written in the '90s and NEVER UPDATED. Another one had a script he'd written for his own specialized assembly macro preprocessor that he'd been using without updates for who even knows how long.
Now, we were using one of those goofy virtual machines with its own simplified assembly language, and we were on the fourth version of the program. This guy'd written his macro processor in Java for the second version, never updated his Java source, only provided a barely-working .bat script for running it, even though the department's official preference was a *nix environment, and implemented this horrid "pretty-printer" that had a regrettable little habit of eating code. You heard that right. You'd run build.bat and it'd expand your macros, then send it over to the pretty-printer, which would very infrequently just replace the existing program file with an empty file.
When we brought it to his attention, he goes "...huh. never happened to me." and proceeded to use the very same set of programs for the next three semesters, even when the assembly simulator was updated again. I heard wails of anguish from the poor sad souls that came after me as their macro processor created program files with deprecated operations, their pretty-printer printed out beautiful, perfectly-organized empty files, and the professor responded to every second of a student begging for an updated version with "...huh. never happened to me." I never saw a single bug reported to either of those professors even acknowledged, let alone fixed. Some of the Java Mad Libs were the same ones they'd started using when they first switched the curriculum from Ada to Java. Thankfully after my first year I escaped into the bliss of the next three years, which were full of *nix and C and beauty.
-
!rant
Ok so I'm about to start working on an OS, but I am going to run through a few tutorials first to get the base systems down; then I'll incorporate an interpreter for BASIC and my custom scripting language.
Just curious if anyone can point me in the direction of a few well-written tutorials that will explain the systems being used. (I want to use Assembly and C only btw, but am open to others)
I only have 1 decent tutorial but it's older and complete (https://github.com/cfenollosa/...)3 -
everything is going as planned! :)
Learned Rust. i loved it (that doesn't mean i am done learning, na? No! never stop)
a new language i could do game memory hacking in without worrying about C++ memory leaks or issues. it also compiles down to assembly! another of my favorite languages!
(i use rust for game development and other stuff)
i am not leaving C / C++ though, that would be harsh!
i abandoned javascript for react and typescript.
to be honest the developer just made javascript and left us with a [object Object]
finished learning the android java api, so im basically set; anything i want to make, i can just go on my pc, listen to music and write it out in a couple of days.
well phazor what are you going to do now?!
i will code till i am old.
i will leave my mark like a shid that made its skid in the bowl :)5 -
Top 12 C# Programming Tips & Tricks
Programming can be described as the process which leads a computing problem from its original formulation to an executable computer program. This process involves activities such as developing understanding, analysis, generating algorithms, verifying those algorithms (including their accuracy and resource usage), and implementing them in the chosen programming language. The source code can be written in one or more programming languages. The purpose of programming is to find a series of instructions that automates the solving of a specific problem or the performance of a particular task. Programming requires competence in various subjects, including formal logic, understanding of the application domain, and specialized algorithms.
1. Write Unit Test for Non-Public Methods
Many developers do not write unit tests for non-public methods, because those methods are invisible to the test project. C# enables one to expose an assembly's internals to another assembly. The trick is to include the following in the AssemblyInfo.cs file:
// Make the internals visible to the test assembly
[assembly: InternalsVisibleTo("MyTestAssembly")]
2. Tuples
Many developers build a POCO class just to return multiple values from a method. Tuples, introduced in .NET Framework 4.0, avoid that.
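A minimal sketch of returning two values without a POCO class (the method and values are illustrative):

using System;

class UserLookup
{
    // Tuple.Create packs both values; no throwaway class needed
    public static Tuple<int, string> GetUser()
    {
        return Tuple.Create(42, "Alice");
    }

    static void Main()
    {
        var user = GetUser();
        Console.WriteLine($"{user.Item1}: {user.Item2}"); // 42: Alice
    }
}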
3. Do not bother with Temporary Collections, Use Yield instead
When developers want to pick matching items out of a collection, they often create a temporary list to hold the selected items before returning them.
To prevent that temporary collection from being created, developers can use yield: results are handed out one at a time as the result set is enumerated, as in the sketch below.
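A minimal sketch of the yield version (the method name and predicate are illustrative):

using System.Collections.Generic;

static class Filters
{
    // no temporary list: each match is streamed to the caller on demand
    public static IEnumerable<int> EvenItems(IEnumerable<int> source)
    {
        foreach (var item in source)
            if (item % 2 == 0)
                yield return item;
    }
}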
Developers also have the option of using LINQ.
4. Making a retirement announcement
Developers who ship re-distributable components and plan to deprecate a method in the near future can decorate it with the Obsolete attribute to communicate this to clients:
[Obsolete("This method will be deprecated soon. You could use XYZ alternatively.")]
Upon compilation, the client gets a warning with that message. To fail a client build that still uses the deprecated method, pass the additional Boolean parameter as true:
[Obsolete("This method is deprecated. You could use XYZ alternatively.", true)]
5. Deferred Execution While Writing LINQ Queries
When a LINQ query is written in .NET, it is only executed when its result is enumerated; this behavior is known as deferred execution. Developers should understand that the query runs again on every enumeration of the result. To prevent repeated execution, materialize the LINQ result into a List after the first execution. Below is an example:
public void MyComponentLegacyMethod(List<int> masterCollection)
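The article's snippet is truncated here, so this is a hedged reconstruction of the pitfall it was likely illustrating (the method signature comes from the article; the body is illustrative):

using System;
using System.Collections.Generic;
using System.Linq;

class MyComponent
{
    public void MyComponentLegacyMethod(List<int> masterCollection)
    {
        var query = masterCollection.Where(x => x > 10); // nothing runs yet
        var count = query.Count();    // executes the query
        var first = query.First();    // executes the query AGAIN
        var results = query.ToList(); // materialize once; enumerate cheaply after
        Console.WriteLine($"{count} items, first: {first}, cached: {results.Count}");
    }
}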
6. Explicit keyword conversions for business entities
Utilize the explicit keyword to define a conversion from one business entity to another. The conversion operator is invoked when the cast is applied in code, as in the sketch below.
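A minimal sketch of such an operator (the entity names are illustrative):

class OrderDto
{
    public int Id;
}

class Order
{
    public int Id;

    // invoked by the cast: var order = (Order)dto;
    public static explicit operator Order(OrderDto dto)
    {
        return new Order { Id = dto.Id };
    }
}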
7. Absorbing the Exact Stack Trace
In the catch block of a C# program, if an exception is rethrown with throw ex; and the actual fault occurred in the method ConnectDatabase, the exception's stack trace will only indicate that the fault happened in the method RunDataOperation. Rethrowing with a bare throw; preserves the exact stack trace.
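The snippet the article refers to is missing, so here is a hedged sketch of that pattern (ConnectDatabase and RunDataOperation are the method names from the text; the exception is illustrative):

using System;

class DataComponent
{
    void ConnectDatabase()
    {
        throw new InvalidOperationException("connection failed"); // actual fault
    }

    public void RunDataOperation()
    {
        try
        {
            ConnectDatabase();
        }
        catch (Exception)
        {
            // "throw ex;" (with a caught variable) would reset the trace here
            throw; // a bare throw keeps ConnectDatabase in the stack trace
        }
    }
}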
8. Enum Flags Attribute
Decorating an enum with the Flags attribute in C# turns it into a bit field, which lets developers combine enum values. One can use C# code along the lines of the sketch below.
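The article's snippet is missing, so this is a hedged reconstruction consistent with the output it describes (the enum name and values are assumptions; 2 + 4 + 8 = 14):

using System;

[Flags]
enum Snakes
{
    None        = 0,
    BlackMamba  = 2,
    CottonMouth = 4,
    Wiper       = 8
}

class Program
{
    static void Main()
    {
        var combined = Snakes.BlackMamba | Snakes.CottonMouth | Snakes.Wiper;
        Console.WriteLine(combined); // with [Flags]: "BlackMamba, CottonMouth, Wiper"
    }
}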
The output for this code will be "BlackMamba, CottonMouth, Wiper". When the Flags attribute is removed, the output is just 14.
9. Implementing the Base Type for a Generic Type
When developers want to enforce that the generic type provided to a generic class implements a particular interface, they can add a where constraint, as in the sketch below.
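A minimal sketch of such a constraint (the interface and class names are illustrative):

public interface IEntity
{
    int Id { get; }
}

// the where clause enforces that every T implements IEntity
public class Repository<T> where T : IEntity
{
    public int GetId(T item)
    {
        return item.Id;
    }
}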
10. Using Property as IEnumerable doesn’t make it Read-only
When a class exposes its internal list as an IEnumerable property via AsEnumerable, calling code can cast the property back to List and modify the underlying collection. In order to avoid this, add AsReadOnly as opposed to AsEnumerable.
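A minimal sketch of the leak and the fix (the class and property names are illustrative):

using System.Collections.Generic;
using System.Linq;

public class Basket
{
    private readonly List<string> items = new List<string>();

    // leaky: AsEnumerable returns the same List instance, so
    // ((List<string>)basket.Leaky).Add("oops") would succeed
    public IEnumerable<string> Leaky => items.AsEnumerable();

    // safe: AsReadOnly wraps the list, so the same cast fails
    // with an InvalidCastException
    public IEnumerable<string> Safe => items.AsReadOnly();
}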
11. Data Type Conversion
More often than not, developers have to convert between data types for different reasons. For example, converting a decimal variable that already holds a value to an int, as in the sketch below.
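A minimal sketch of the two usual options (the values are illustrative):

using System;

class Program
{
    static void Main()
    {
        decimal price = 12.75m;
        int rounded   = Convert.ToInt32(price); // rounds to 13
        int truncated = (int)price;             // truncates to 12
        Console.WriteLine($"{rounded} {truncated}");
    }
}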
Source: https://freelancer.com/community/...2 -
Rubber ducking your ass in a way, I figure things out as I rant and have to explain my reasoning or lack thereof every other sentence.
So lettuce harvest some more: I did not finish the linker as I initially planned, because I found a dumber way to solve the problem. I'm storing programs as bytecode chunks broken up into segment trees, and this is how we get namespaces, as each segment and value is labeled -- you can very well think of it as a file structure.
Each file proper, that is, every path you pass to the compiler, has its own segment tree that results from breaking down the code within. We call this a clan, because it's a family of data, structures and procedures. It's a bit stupid not to call it "class", but that would imply each file can have only one class, which is generally good style but still technically not the case, hence the deliberate use of another word.
Anyway, because every clan is already represented as a tree, we can easily have two or more coexist by just parenting them as-is to a common root, enabling the fetching of symbols from one clan to another. We then perform a canonical walk of the unified tree, push instructions to an assembly queue, and flatten the segmented memory into a single pool onto which we write the assembler's output.
I didn't think this would work, but it does. So how?
The assembly queue uses a highly sophisticated crackhead abstraction of the CVYC clan, or said plainly, clairvoyant code of the "fucked if I thought this would be simple" family. Fundamentally, every element in the queue is -- recursively -- either a fixed value or a function pointer plus arguments. So every instruction takes the form (ins (arg[0],arg[N])) where the instruction and the arguments may themselves be either fixed or indirect fetches that must be solved but in the ~ F U T U R E ~
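In C# terms -- and this is just a hedged sketch of the idea, not the compiler's actual code -- each queued operand is either a fixed value or a deferred fetch whose arguments may themselves be deferred:

using System;

abstract class Operand
{
    public abstract long Resolve();
}

class Fixed : Operand
{
    private readonly long value;
    public Fixed(long value) { this.value = value; }
    public override long Resolve() => value;
}

class Deferred : Operand
{
    private readonly Func<Operand[], long> fetch;
    private readonly Operand[] args;
    public Deferred(Func<Operand[], long> fetch, params Operand[] args)
    {
        this.fetch = fetch;
        this.args = args;
    }
    // arguments may themselves be deferred, so resolution recurses
    public override long Resolve() => fetch(args);
}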
Thusly, the assembler must be made aware of the fact that it's wearing sunglasses indoors and high on cocaine, so that these pointers -- and the accompanying arguments -- can be solved. However, your hemorrhoids are great, and sitting may be painful for long, hard times to come, because even trying to do this kind of John Connor solving of pinky promises that loop on themselves is slowly reducing my sanity.
But minor time travel paradoxes aside, this allows for all existing symbols to be fetched at the time of assembly no matter where exactly in memory they reside; even if the namespace is mutated, and so the symbol duplicated, we can still modify the original symbol at the time of duplication to re-route fetchers to its new location. And so the madness begins.
Effectively, our code can see the future, and it is not pleased with your test results. But enough about you being a disappointment to an equally misconstructed institution -- we are vermin of science, now stand still while I smack you with this Bible.
But seriously now, what I'm trying to say is that linking is not required as a separate step as a result of all this unintelligible fuckery; all the information required to access a file is the segment tree itself, so linking is appending trees to a new root, and a tree written to disk is essentially a linkable object file.
Mission accomplished... ? Perhaps.
This very much closes the chapter on *virtual* programs, that is, anything running on the VM. We're still lacking translation to native code, and that's an entirely different topic. Luckily, the language is pretty fucking close to assembler, so the translation may actually not be all that complicated.
But that is a story for another day, kids.
And now, a word from our sponsor:
<ad> Whoa, hold on there, crystal ball. It's clear to any tzaddiq that only prophets can prophecise, but if you are but a lowly goblinoid emperor of rectal pleasure, the simple truths can become very hard to grasp. How can one manage non-intertwining affairs in their professional and private lives while ALSO compulsively juggling nuts?
Enter: Testament, the gapp that will take your gonad-swallowing virtue to the next level. Ever felt like sucking on a hairy ballsack during office hours? We got you covered. With our state of the art cognitive implants, tracking devices and macumbeiras, you will be able to RIP your way into ultimate scrotolingual pleasure in no time!
Utilizing a highly elaborated process that combines illegal substances with the most forbidden schools of blood magic, we are able to [EXTREMELY CENSORED HERETICAL CONTENT] inside of your MATER with pinpoint accuracy! You shall be reformed in a parallel plane of existence, void of all that was your very being, just to suck on nads!
Just insert the ritual blade into your own testicles and let the spectral dance begin. Try Testament TODAY and use my promo code FIRSTBORNSFIRSTNUT for 20% OFF in your purchase of eternal damnation. Big ups to Testament for sponsoring DEEZ rant.3 -
Damn lots of you knew this shit before turning of age.
I didn't code a single line until I went to college.
I tried to, but it was just too fucking complicated and I didn't understand a thing. Tried to grasp how to use some tools like Unity or an Adventure Maker of sorts and something called Flix for Flash games. Didn't understand shit.
I decided to study systems engineering due to a career aptitude test I took, hoping somehow that way I could learn something.
First thing I was taught was bash.
When I realised I already knew enough to code a whole text adventure from scratch with such a simple language I felt really hyped.
Always loved text and graphic adventures.
Afterwards I was taught the Z80 assembly language and how CPU registers worked and it blew my fucking mind.
That was the first half-year.
Then I was taught C. And boy was it hard. Didn't get how memory was being handled until the very end.
I happened to be one of the few passing a stupidly complicated semifinal test with triple indirection pointers.
That felt goood.
Learning other languages afterwards was a piece of cake. C#, Java, X86 assembly, C++...
It was a hard door to open. Fucking heavy. But now nothing seems black magic anymore and boy isn't that something to be proud of! :D -
I know this is too late to ask this question, but I'm a final-year computer science student, average in all core subjects, with zero knowledge of web development (except a few HTML tags, but not enough to make a Wikipedia-like website) or other professional streams.
I know Java and Python well enough to write OOP classes and understand code written in them.
Should I:
A) study more about web dev/ML-AI/testing/other "professional" stuff,
B) learn more and strengthen my core subjects, like operating systems, algorithms and data structures, or
C) learn another core language like C/C++/assembly?27
Typically every computer science major begins with C, C#, C++, Java or Python, creating so much abstraction from the hardware that your mind just fills with questions that remain unanswered. Whenever I program something I always think about how the underlying stuff is working. They never explain how and where software meets the hardware. Why are they keeping students away from the hardware? I think a CS graduate who doesn't know the underpinnings of a computer should not be considered a CS graduate; as opposed to a software engineer, a computer science major relates to everything that is a computer, and that includes the theoretical stuff and a little bit of know-how about computer hardware. Instead of teaching this stuff and assembly as a language in the first semester, they teach you Java or C++. I could not speculate on why this is so.11
-
I have failed my computer organization and architecture module because I didn't understand assembly language.
Anyone with links to the best x86 assembly programming resources, please share.
I learned to program in community college, where they had us learn C#, but I honestly didn't catch the programming bug until we had to do assembly language. I just found it fun to break what I needed to do into small steps, and at the end I got to see the result of all that work!
-
Whenever I rant about JavaScript and its terrible way of doing things differently and totally illogically compared to how real programmers would do things, versus webdev scriptkiddies...
Whenever I laugh about these engineers who can only 'code' in Matlab...
Whenever I hear people consider configuring (of stuff like WordPress or RGB-Keyboard-Lights etc.) as 'programming'...
I wonder if I'm just like the 'Real Programmers' back in 1983 who truly considered Fortran or assembly to be far superior to Pascal, and who wouldn't accept as a programmer someone who coded in the latter or even used a simple OS like UNIX.
Found that old article about "Real Programmers".
It's worth a read.
http://pbm.com/~lindahl/...
Just consider someone writing modern computer programs without libraries, without ifs and for loops, only gotos, by hand from top to bottom...
Some day I want to start some modern project everyone else would do in some random modern scripting language and hack it down in assembly just for fun and to tell people, I did it. So I could call myself a Real Programmer too.2