Search - "compiler design"
-
Sometimes.....
When I want to escape how dull/repetitive/boring the world of web development is, I crack open a nice lil terminal, dust off my gcc/g++ compilers and fuck around in C or C++ till my eyes start to bleed.
I have been fucking around with systems development, mainly Linux programming. I have also started to dig deeper into game engine design and compiler design... because low-level development is where it's at.
A man can only fuck around with REST APIs, CSS, HTML and the endless sea of JavaScript and other dynamic languages for so long before going crazy.
Eventually... I would want to code something impressive enough to earn me a spot somewhere as a C or C++ developer. I just can't work with web development any longer, man. It really is not what I want to do; the fact that I do it (and that I am good at it) is circumstantial more than because I really enjoy it. I really don't. -
I'm working on a programming language with a "bytecode" interpreter and a compiler that translates source code to said bytecode and... it sort of actually works!
I want to recreate an Erlang-style environment, currently you can write functions, call C++ functions via wrappers, have immutable-only values, and it has no explicit control structure apart from statement sequencing and the if-expression because I want to make it as functional as possible. Next thing on the list is to add a green threads implementation and ability to spawn and send messages to processes.
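(For flavor: the core of a bytecode interpreter like this is basically just a dispatch loop. Here's a minimal C++ sketch with made-up opcodes, not my actual implementation:)

```cpp
#include <cstddef>
#include <cstdint>
#include <iostream>
#include <vector>

// Minimal stack-based bytecode VM sketch. Opcode names are hypothetical.
enum class Op : uint8_t { Push, Add, Mul, Print, Halt };

struct Instr {
    Op op;
    int64_t arg;  // only meaningful for Push
};

void run(const std::vector<Instr>& code) {
    std::vector<int64_t> stack;
    for (std::size_t pc = 0; pc < code.size(); ++pc) {
        const Instr& in = code[pc];
        switch (in.op) {
            case Op::Push: stack.push_back(in.arg); break;
            case Op::Add: {                       // pop two, push sum
                int64_t b = stack.back(); stack.pop_back();
                stack.back() += b;
                break;
            }
            case Op::Mul: {                       // pop two, push product
                int64_t b = stack.back(); stack.pop_back();
                stack.back() *= b;
                break;
            }
            case Op::Print: std::cout << stack.back() << "\n"; break;
            case Op::Halt: return;
        }
    }
}

int main() {
    // Bytecode equivalent of print((2 + 3) * 4)
    run({{Op::Push, 2}, {Op::Push, 3}, {Op::Add, 0},
         {Op::Push, 4}, {Op::Mul, 0}, {Op::Print, 0}, {Op::Halt, 0}});
}
```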
Still a WIP and heck even design-in-progress.
Now for the rant:
I'm using CMake for building the C++ (interpreter) side and Stack for the Haskell (compiler) side, and I've been trying to get them to talk to each other for hours, because I want CMake to manage the Stack build too and shove all the executables into one place. The CMake documentation is weird and Stack isn't too helpful either, so I guess I'll just spend another few hours trying to get Stack to fuckin reveal its build directory to CMake and/or build to a given directory. Ugh. -
Someone mentioned HolyC in another thread and I automatically knew they were referencing the language, based on C, developed by Terry A. Davis of TempleOS and schizophrenia fame.
I legit felt sad for the man, he was obviously a very talented and smart programmer. If you removed all the racial slurs, crazy dialogues and biblical stuff caused by his mental illness, you were left with a very brilliant and dedicated programmer.
While Hurd (the GNU kernel, still unfinished after decades in the making) will fucking never see the light of day, Terry was able to create, all by himself: his own compiler for his own programming language, a kernel, drivers, a desktop environment and a filesystem. I mean, fuck me dude, he even included games of his own design in the damned thing, using very advanced concepts found in flight simulators or Doom-like FPSes.
It just bothers me so much, the dude would have probably done amazing non-religious things if it were not for his illness.
If you like reading about this sort of thing, check him out, there are a couple of youtube videos by him. Don't be put off by the shit that he spews in some videos, remember, he was saying shit like that out of a very real mental illness.
Oh, and fuck Hurd -
Currently I'm working on a 3D game engine and making a 3D minesweeper game with it.
Not long ago I started creating a compiler using my own implementation (no Lex, no tools, nothing, just raw application of the algorithms), so that hopefully some day I will be able to make a language that works on top of GLSL inside my game engine. I have a compiler design class this semester, which hasn't even started yet, and I've already made a lexical analyser generator. I also have another class about geographical information systems, for which I will be using my engine to create demos of some 3D rendering techniques like level of detail, or maybe to create something similar to ArcGIS, which we will be using.
Oh man, I have so much stuff I want to do.
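(For anyone wondering what "no Lex, just raw algorithms" means in practice: a hand-rolled lexer is basically a loop over characters. A toy C++ sketch, nothing like the actual generator I built:)

```cpp
#include <cctype>
#include <cstddef>
#include <iostream>
#include <string>
#include <vector>

// Token kinds for a toy language; purely illustrative.
enum class Tok { Ident, Number, Symbol, End };

struct Token {
    Tok kind;
    std::string text;
};

std::vector<Token> lex(const std::string& src) {
    std::vector<Token> out;
    std::size_t i = 0;
    while (i < src.size()) {
        unsigned char c = src[i];
        if (std::isspace(c)) { ++i; continue; }      // skip whitespace
        if (std::isalpha(c)) {                       // identifier: [a-z][a-z0-9]*
            std::size_t start = i;
            while (i < src.size() && std::isalnum((unsigned char)src[i])) ++i;
            out.push_back({Tok::Ident, src.substr(start, i - start)});
        } else if (std::isdigit(c)) {                // number literal: [0-9]+
            std::size_t start = i;
            while (i < src.size() && std::isdigit((unsigned char)src[i])) ++i;
            out.push_back({Tok::Number, src.substr(start, i - start)});
        } else {                                     // everything else: 1-char symbol
            out.push_back({Tok::Symbol, std::string(1, src[i])});
            ++i;
        }
    }
    out.push_back({Tok::End, ""});
    return out;
}

int main() {
    for (const Token& t : lex("x = 42 + y;"))
        std::cout << "'" << t.text << "'\n";
}
```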
Here is a gif showing the state of my minesweeper game. I clearly lack artistic skills lol. One thing I will be changing is to model the sphere out of squares instead of triangles.
Finally, I want to mention that months ago I saw someone here on devRant making a Voronoi-diagram variant of this, which inspired me to make it.
I made a long post, so
TL;DR: having fun reinventing the wheel and learning 😀 -
Not exactly wrong there. I got started at 10 with odds and ends for fun.
Took a semester of it in college before switching because the compiler and I didn't get along.
Only to pick it back up via self-study 10 years later because the programmer who came on board for a side project never showed up and I wanted to keep things moving.
10 years later still, I'm now running a web design shop full time. -
From NAND to Tetris..
This book is IMO the best book for those who want to venture into low-level programming.
This book retrains your thinking, teaching you from the bottom up, not with the typical top-down approach.
You begin with the idea of Boolean algebra and then move on to logic gates. From there you build, in the book's own HDL, everything you will use later.
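(The "everything from NAND" idea is easy to demo even outside an HDL. A tiny plain-C++ sketch of how every gate derives from NAND alone:)

```cpp
#include <iostream>

// NAND is functionally complete: every gate below is built only from it.
bool nand_(bool a, bool b) { return !(a && b); }

bool not_(bool a)         { return nand_(a, a); }
bool and_(bool a, bool b) { return not_(nand_(a, b)); }
bool or_(bool a, bool b)  { return nand_(not_(a), not_(b)); }
bool xor_(bool a, bool b) { return and_(or_(a, b), nand_(a, b)); }

int main() {
    // Print the XOR truth table to verify the composition.
    for (int a = 0; a <= 1; ++a)
        for (int b = 0; b <= 1; ++b)
            std::cout << a << " XOR " << b << " = " << xor_(a, b) << "\n";
}
```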
Essentially you are building your own "virtual machine": you design the instruction set, and then write assembly against that instruction set to control the gates you built in the HDL.
THEN you continue up the abstraction layers, learn how a compiler works, and begin writing C-like code that is compiled down to the assembly of your instruction set, to be linked and run on the virtual machine you built.
All the compiler and other tools are available on the book's website. This is not a book where you copy, paste, run and be done... you kinda have to take the concepts and apply them yourself.
Then once you master this book, take it a step further and learn more about compilers, and write your own compiler with the dragon book or something.
Fantastic book, great philosophy on teaching software: ground up rather than top down. Love it! It's a unique book. -
Stop teaching people deprecated bulls*it.
I'm taking a "Web Design" course and the teacher wants us to use HTML attributes and the <font> tag to format pages. He doesn't allow us to use CSS. Says "We'll get to CSS later, right now I'm teaching you HTML". He taught us the <frameset> thing, which isn't even supported in HTML5. And of course no <header>, <footer>, <aside> etc.
Same thing in my C++ course. The computers don't even have a C++11 (or newer) compiler, just an old version of Code::Blocks we're not allowed to update. It does support C++0x so you can still get some of the features, but still. -
I just watched a talk given by Ryan Dahl, highlighting what he considers to be some early design mistakes with Node:
- Removed early version of Promises
- Not sandboxed by default
- GYP compiler
- package.json
- node_modules
- require() without extension
- index.js by default
https://youtube.com/watch/...
Also, his new project Deno sounds like Node 2.0. Interesting! -
My two cents: Java is fucking terrible for computer science. Why the fuck would you teach somebody such a verbose language with so many unwritten rules?
If you really want your students to learn about computers, why not C? Java has no pointers, no pass-by-reference, no manual memory management, and lots of obscure class structures and design patterns; this shit is garbage. The student will almost never have contact with the compiler; many don't even know of the existence of a compiler.
Java is so enterprise-focused and just fucked up for educational purposes. And I say that as somebody who (still) uses it as their main language.
If you want your students to be productive and learn about software engineering, why not Python? Things are simple in Python and can be done way more easily without students becoming code monkeys (assuming they don't use a whole library for each task). I mean, Java takes a whole goddamn class and an explicitly declared entry point, which is btw fucking verbose, just to print something to the console.
Fuck Java. -
By constantly fucking around with things that interest me. If a topic fascinates me I will either look for shit around YouTube, read proper documentation, or buy specialized books on it.
Most recently it has been compiler design. I wanna write my own language, for testing and learning more than anything.
I dunno, it keeps shit fun and interesting. Now, much of that shit ain't applied to what I do in web. But it does help to keep the mind fresh, as well as giving me the chance to eventually invent my own language, write a large system with it, use it at the institution and have them pay me obscene amounts of money to maintain it.
It will be like VB6 or VbScript, but with {}s, immutable values by default and no looping, cuz I am evil AF -
At last !!
All projects are done!
Done 3 compiler projects (with 3 separate sets of documentation)
And 6 different algorithms for algorithm design and analysis (including my own project).
Been coding so hard to finish them on time, and now I made it :) -
I believe it is really useful, because all of the elements of discipline and perseverance that are required to be effective in the workforce will be tested in one way or another by a higher learning institution. Getting my degree made me a little more tolerant of other people and of the idea of working with others; it also exposed me to a lot of topics that I was otherwise uninterested in and ended up loving. For example, prior to going to uni I was a firm believer that I could and was going to learn everything regarding web dev by maaaaaself without the need of a school. I wasn't wrong. And most of you wouldn't be wrong. Buuuuuut what I didn't know is how interesting compiler design was, how interesting systems-level development was, etc. School exposed me to many topics that would have taken me time to get to otherwise, and not just in CS but in many other fields.
I honestly believe that deciding to NOT go to school and perpetuating the idea that school is not needed in the field of software development ultimately harms our field by making it look like a trade.
Pffft, you don't need to pay Johnny his $50-an-hour rate! They don't need school to learn that shit! Anyone can do it, give him 9.50 and call it a day! <------- that is shit I have heard before.
I also believe it is funny that people tend to believe the idea of self-learning will put you above and beyond a graduate, as if the notion of self-learning were some sort of mutually exclusive deal. I mean, congrats on learning about if statements, man! I had to spend time out of class self-learning discrete math and relearning everything regarding calculus and literally every math topic under the sun (my CS degree was very math-oriented) while simultaneously applying those concepts in Mathematica, R, Python, Java and C++, as well as making sure our shit lil OS emulation (in C, why thank you) worked! Oh, and what's that? We have that for next week?
Mind you, I did this while I was already employed as a web and mobile developer.
Which, btw: make sure you don't go to a shit school ;) it does help in regards to learning the goood shit. -
Avoid ACPICA if at all possible. It's one garbage-tier clusterfuck of bad design, horrible documentation and downright misleading and wrong code.
It's meant to consist of an ASL compiler, disassembler, debugger, dumper, various user-space utilities and a kernel-resident OSPM implementation, *if* you can figure out what belongs to what. Even just compiling this pile of trash is a mystery in itself. Think you need the source files in source/common? EEEEH, wrong. Well, at least partially, since most of them seem to be for the user-space stuff..? Other ones *are* needed, on the other hand. At least the disassembler and/or debugger and/or dumper components seem to reference them. Not that I could figure out how to compile those anyway. The real path to your goal seems to be to ignore a seemingly arbitrary subset of source and header files until your linker stops complaining.
There's also a bunch of configuration defines, some of which *you* define, some defined *for* you, based, again, on others. Of course most of them do stupid shit. Enabling the debugger automatically enables debug logging. Enabling the disassembler force-enables debug allocation tracking... What?
The code itself isn't of much help either. Looking in "os_specific/service_layers" you find what look to be reference implementations of ACPICA functions for certain OSes like Windows and Unix. Of course I had a look, because AcpiOsReadMemory is supposed to read physical memory and I don't know how I would even implement that. But hey, osunixxf.c (xf for interface... of course) should tell me. I'll let you see for yourself in the attached image. Apparently it does fuck all and just returns AE_OK. No error, no logging, no nothing. Just ok. As you can imagine, AcpiOsWriteMemory doesn't do much more either.
...okay, so maybe physical memory accesses aren't actually used and these functions are some sort of relic from past times? Nope! They are absolutely necessary for doing low-level device interaction. WTF. So finally I went to the Linux source and checked how *they* implemented them, and just as I thought, these functions are anything but no-ops...
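(For contrast, here's roughly what a *working* physical-memory read has to do on Linux: a hedged sketch using /dev/mem and mmap, with simplified types rather than the real ACPICA signatures. It needs root, a kernel that hasn't locked /dev/mem down, and it assumes the value doesn't straddle a page boundary:)

```cpp
#include <cstdint>
#include <cstdio>
#include <fcntl.h>
#include <sys/mman.h>
#include <unistd.h>

// Simplified stand-in for AcpiOsReadMemory: map the page containing the
// physical address, read the value, unmap. NOT the real ACPICA signature.
bool read_phys(uint64_t phys, uint32_t width_bits, uint64_t* out) {
    int fd = open("/dev/mem", O_RDONLY | O_SYNC);
    if (fd < 0) return false;

    long page = sysconf(_SC_PAGESIZE);
    uint64_t base = phys & ~(uint64_t)(page - 1);  // page-align the address
    uint64_t off  = phys - base;

    void* map = mmap(nullptr, page, PROT_READ, MAP_SHARED, fd, (off_t)base);
    close(fd);
    if (map == MAP_FAILED) return false;

    const volatile uint8_t* p = (const volatile uint8_t*)map + off;
    switch (width_bits) {
        case 8:  *out = *(const volatile uint8_t*)p;  break;
        case 16: *out = *(const volatile uint16_t*)p; break;
        case 32: *out = *(const volatile uint32_t*)p; break;
        case 64: *out = *(const volatile uint64_t*)p; break;
        default: munmap(map, page); return false;
    }
    munmap(map, page);
    return true;
}

int main() {
    uint64_t v = 0;
    // Example: peek 32 bits from the legacy BIOS area (platform-specific!).
    if (read_phys(0x000F0000, 32, &v))
        std::printf("0x%llx\n", (unsigned long long)v);
}
```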
...So for what fucking reason do these stupid interface implementations even exist, but to purposefully mislead you?? They aren't used for fucking anything! As far as I know, Windows doesn't even *use* ACPICA, and Linux has its own fork with working implementations... They just sit there, just to tell you how NOT to do it.
So that's some of my thoughts about ACPICA. Note that I haven't even used it as a library yet, I just got it to compile and link and it already fucked with me this much.
There's also so much more I didn't mention, like the fact that you *have* to modify the ACPICA source in order to get your own platform header working (else #error), even though the docs explicitly instruct you not to. But you get the point.
Don't use ACPICA if you don't have to. Save your sanity for something that's worth it -
C++98, C++03, C++11 and C++14
I love when your design finally ends up working, looks good and runs as fast as Usain Bolt, but why the hell does OOP have to be so ugly and clunky in C++? Constructor and copy-constructor design is barf-inducing. Yes, I am trying to make it as readable and neat as possible, but it still looks like shit overall. And the related compiler errors are almost always retarded or unhelpful, even though I'm used to it by now.
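(For anyone who hasn't suffered pre-C++11: own a single raw resource and you owe the compiler the whole rule-of-three ceremony. A minimal C++03-compatible sketch of what I mean by barf-inducing:)

```cpp
#include <algorithm>
#include <cstddef>

// C++03 rule of three: manage a raw buffer and you must hand-write a
// destructor, a copy constructor and a copy assignment operator. No move
// semantics, no "= default", no delegating constructors.
class Buffer {
public:
    explicit Buffer(std::size_t n) : size_(n), data_(new unsigned char[n]) {}

    Buffer(const Buffer& other)                    // copy constructor
        : size_(other.size_), data_(new unsigned char[other.size_]) {
        std::copy(other.data_, other.data_ + size_, data_);
    }

    Buffer& operator=(const Buffer& other) {       // copy assignment
        if (this != &other) {
            unsigned char* fresh = new unsigned char[other.size_];
            std::copy(other.data_, other.data_ + other.size_, fresh);
            delete[] data_;
            data_ = fresh;
            size_ = other.size_;
        }
        return *this;
    }

    ~Buffer() { delete[] data_; }                  // destructor

private:
    std::size_t size_;
    unsigned char* data_;
};

int main() {
    Buffer a(16);
    Buffer b(a);   // deep copy via copy constructor
    b = a;         // deep copy via copy assignment
}
```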
I know you will tell me "why are you using those old-ass versions?". Well, unfortunately in embedded you are stuck with old crap until some envoy of the gods finally bumps the standards... or until I do it myself for a specific platform. -
I just saw that ARM released their DesignStart IP for the Cortex-M0 for free to the masses; it's obfuscated Verilog code.
I worked on SoC design based on this in college, but back then it took a lot of paperwork to get these files; now they are free to download.
This is exciting, as it makes open-source, community-based microcontroller design possible.
The only missing piece here is that the Verilog compiler they use is not open source.
Has anyone messed around with the Cortex-M0 DesignStart plus ghdl or iverilog? I am about to start a little side project, will update more on this. -
I already built a compiler and an interpreter.
ONLY ON THE THIRD TRY did I realize that the hard part is the language design. Let's hope Gerlang 3.0 will turn out usable lol
specs: https://github.com/MaximilianJugend...
PS: I hate apes -
So I figure, since I straight up don't care about the Ada community anymore and my programming focus is languages and language tooling, I'd rant a bit about some stupid things the language did. A necessary disclaimer though: I still really like the language, I just take issue with the defense of things that are straight up bad. Just admit that at the time it was good, but in hindsight it wasn't. That's okay.
For the many of you unfamiliar, Ada is a high security / mission critical focused language designed in the 80's. So you'd expect it to be pretty damn resilient.
Inheritance is implemented through "tagged records" rather than contained in classes, but dispatching basically works as you'd expect. The only problem is, there's no sealing of these types. So you always have to design everything with the assumption that someone can inherit from your type and manipulate it. There are also limited accessibility modifiers, and they're not granular, so if you inherit from the type you have access to _everything_, as if it were all protected/friend.
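(To make the sealing complaint concrete: C++ lets you flat-out forbid derivation with final, and as far as I know tagged records have no equivalent. A quick sketch of the escape hatch Ada lacks:)

```cpp
// C++ lets you seal a type so nobody can derive from it and fiddle with
// its internals; Ada's tagged records offer no such guarantee.
class Account final {
public:
    explicit Account(long cents) : balance_cents_(cents) {}
    long balance() const { return balance_cents_; }
private:
    long balance_cents_;
};

// class Overdraft : public Account {};  // compile error: Account is final

int main() {
    Account a(1000);
    return a.balance() == 1000 ? 0 : 1;
}
```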
Switch/case statements are only checked to ensure that all valid values are handled. Read that carefully: all _valid_ values are handled. You don't need a "default" (what Ada calls "when others"). But unchecked conversions, view overlays, deserialization and more can introduce invalid values. The default case is meant to handle this, but Ada just goes "nah, you're good bro, you handled everything you said would be passed to me".
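(The same hazard exists with C++ enums, which is exactly why you write a default case anyway. A small sketch of an "impossible" value sneaking past a switch that covers every declared value:)

```cpp
#include <iostream>

enum class Color { Red, Green, Blue };

const char* name(Color c) {
    switch (c) {                 // handles every *declared* value...
        case Color::Red:   return "red";
        case Color::Green: return "green";
        case Color::Blue:  return "blue";
        default:           return "invalid";  // ...but garbage can still arrive
    }
}

int main() {
    // Deserialization or an unchecked conversion can produce out-of-range values.
    Color bogus = static_cast<Color>(42);
    std::cout << name(bogus) << "\n";  // prints "invalid" thanks to default
}
```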
Like I alluded to earlier, there are limited accessibility modifiers. Ada uses sections, which is fine, but not my preference. It also only has three options, and it's bizarre. One is publicly in the specification, just like "public" normally. One is in the "private" part of the specification, but this is actually just "protected/friend". And one is in the implementation, which is the actual "private". Now, Ada doesn't use classes, so the accessibility blocks are in the package (namespace). So guess what? Everything in your type has exactly the same visibility! Better hope people don't modify things you wanted to keep hidden.
That brings me to another bad decision. There is no "read-only" protection. Granted, this is only a compiler check and can be bypassed, but it still helps prevent a lot of errors. There is const, and it works well, better than in most languages I feel. But if you want a field within a record to not be changeable? Yeah, too bad.
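(Compare C++, where per-field "set once, never reassign" is a one-keyword affair. A tiny sketch of the compile-time read-only protection I'm asking for:)

```cpp
#include <string>
#include <utility>

// A const data member is fixed at construction; any later assignment is a
// compile error. Ada records have no per-field equivalent of this.
struct User {
    const std::string id;   // read-only after construction
    std::string name;       // freely mutable

    User(std::string id_, std::string name_)
        : id(std::move(id_)), name(std::move(name_)) {}
};

int main() {
    User u("u-42", "Ada");
    u.name = "Lovelace";    // fine
    // u.id = "u-43";       // error: assignment of read-only member 'User::id'
}
```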
And if you think properties could fix this? Yeah no. Transparent functions that do validation on superficial fields? Nah.
The community loves to praise the language for being highly resilient and "for serious engineers", but oh my god, these are awful decisions.
Now, again, there are a lot of reasons why I still like the language, but holy shit does it scare me when I see things like an automaker switching over to it.
The leading Ada compiler is literally the buggiest compiler I've ever used in my life. The leading Ada IDE is literally the buggiest IDE I've ever used in my life. And they are written in Ada.
Side note: good resilient systems are a byproduct of knowledge, diligence, and discipline, not the tool you used. -
On my university compiler design exam I got this question:
Write the difference between the Pascal and C programming languages. -
Hardware classes for software dev student?
Hey guys. Currently getting into the second year of a 5-year curriculum to get an 'Integrated Master of Computer Engineering & Informatics' degree here in Greece.
I'm already into software. I'm fooling around with Java, Go and PHP, making some games, web services and anything I find interesting in general. Recently, with the logic design class, I started liking hardware stuff (I didn't really like it before).
We're getting to a point where we might have to decide between picking hardware-centered or software-centered subjects. I'm thinking that I can probably learn whatever is taught on the software side by myself (with a bit more studying, of course), whereas hardware would be more difficult to study alone.
That said, I'm considering picking hardware, but I am skeptical. What do you think? I'll certainly miss out on the concurrent processing, data structures and how-a-compiler-works classes.
What do you think?
P.S. University here is free -
The time when I started building my first interpreter. I had no idea about them, so I just copied the code from the book, but it felt good, really good, and I learned so much about compiler and interpreter design. I guess copying the code and seeing things connect was the best and most badass experience for me.
-
So I'm writing my compiler and I decide to test error handling, to see if I'm catching unexpected tokens and whatnot. I try duplicating a semicolon at the end of a line; surely that'll give me an error, since it's an unexpected token, isn't it? So I run the compiler and... no errors? I start debugging for a few minutes, snoop around, everything seems OK... "Huh, that's weird," and then it dawns on me: a semicolon only marks the end of a statement. So, technically, it's not an unexpected token if you allow an empty statement (which wouldn't break any rules about statements). I decide to test my theory. I put ;;;;;;;; at the end of a random line in my Rust code, hit compile and... it compiles! So that means it is not a bug anymore! I mean, if the big guys who actually know a tad about language design, compilers and all that cool stuff allow it in their languages, why shouldn't I? So I did it, I turned a bug into a feature, and now I can go to sleep in peace and stop dreaming about fucking abstract syntax trees (don't mind my kinks >:) ).
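(In grammar terms the "bug" is just an empty-statement production: statement -> ';' yields a node that does nothing. A toy recursive-descent sketch, with made-up token handling rather than my real compiler, of how stray semicolons get swallowed:)

```cpp
#include <iostream>
#include <stdexcept>
#include <utility>
#include <vector>

// Toy statement parser: a ';' on its own is a valid (empty) statement,
// which is why ";;;;" parses cleanly instead of raising "unexpected token".
struct Parser {
    std::vector<char> toks;  // pretend each char is already a token
    std::size_t pos;

    explicit Parser(std::vector<char> t) : toks(std::move(t)), pos(0) {}

    bool at_end() const { return pos >= toks.size(); }

    void parse_statement() {
        if (toks[pos] == ';') {              // empty statement: consume, done
            ++pos;
            std::cout << "EmptyStmt\n";
        } else {                             // otherwise: expression then ';'
            std::cout << "ExprStmt(" << toks[pos] << ")\n";
            ++pos;
            if (at_end() || toks[pos] != ';')
                throw std::runtime_error("expected ';'");
            ++pos;
        }
    }

    void parse_program() {
        while (!at_end()) parse_statement();
    }
};

int main() {
    Parser p({'x', ';', ';', ';', ';'});  // "x;;;;" -> one expr, three empties
    p.parse_program();
}
```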
Yeah anyways thanks for reading, till next time! Bye! -
How can a novel emerging challenger software (written in Rust) take me 4 hours to install (still ongoing)?
Today I decided to give Pijul a go. Pijul describes itself as a theory-sound alternative to Git, which I have wanted to get away from for a while now, for various reasons -- many of which Pijul advertises to have solved at the design level.
So I set aside a day to learn Pijul, today. Well, 4 hours after I sat down -- after a number of hilariously wonky failures of the "Rust ecosystem" to do the right thing, as I had to install Rust with some shell one-liners those insane wizards recommend as the installation process (all in the name of "stability but not stagnation") -- Pijul has now been installing with the blasted `cargo` for an hour (and that's after 3 hours of getting to the point where `cargo install pijul` stopped exploding in my face), telling me I have only 40 more crates to install. Are they throttling me, perhaps? I don't care -- I should have been installing Pijul from a repository in accordance with my Linux distribution, or, at worst, downloading a BLOODY COMPILED PROGRAM IMAGE.
What is it with hipster developers today? Every tool they touch, they subsume, churning out intricate complexities the likes of which we hadn't seen yesterday. Tell me, fellow developers who think installation of your software has to require three and a half novel "installation solutions" to which I can't be arsed to be made privy -- do you think your life today is easier than, I don't know, wrangling with a Makefile and a C compiler (which today can thankfully do a rather good job of standards compliance)?
I mean, I wouldn't mind Pijul being written in Rust -- but it turns out Rust's advertised elegance is in practice wrapped in so much "giftwrap" that whatever desire I had to learn Rust myself is gone; I'll steer well clear.
Here's some advice for developers in general -- advice continuously ignored for decades -- stop blowing your original scope of delivery on auxiliary packages you think you need to reinvent just because you can, or because your mom is out of town! For programming languages like Rust, this most certainly entails NOT writing your own package manager, with its own package delivery mechanism that has its own configuration file format and virtual machine to configure dependency resolution or what have you!
You wanted to write a programming language that has novel features you think we need? Fine -- write one and stop there. Watch it grow, and watch people who are busy working on other parts (scopes) of software integrate your offering.
What a shitshow. Stop smuggling alternative package managers, installers and discombobulators in with your actual product -- I only want the latter; I don't want the rest of your damn piping, walls, roof and the cathedral on top of it!
Don't be that guy who starts with a pin and ends up with a fucking miniature diorama of a pig farm in the Netherlands. Jesus.