Comments
Makes sense. Just google what a modern aircraft cockpit looks like and imagine that you'll need to interact with all of that stuff, especially in a crisis.
Like the famous Qantas flight 32, where one of the engines of an A380 exploded in flight. Lots of damage from the shrapnel, hydraulics lost, communication lines lost, the flight warning system warning about just about everything and then some. Also remember that Airbus planes have fly-by-wire via electronic lines, and lots of them were broken.
The pilot and his crew suddenly had something that was radically different from a regular A380, and they had to figure out what even worked and how to fly the thing. You can bet they were under high stress when interacting with the computer systems, but they pulled off a miracle and landed the aircraft without anyone being injured.
The other extreme was Boeing's 737 MAX, where the interaction didn't work out that well, and hundreds were killed.
Oh boy, that class sucked.
My favorite part about the class was how chill the professor was and the fact that doing the bare minimum got me an 85 as a final grade.
I basically sat in the back of the class and played Flash games or browsed Reddit.
NoMad: Totally understand you. Sometimes the author of the paper has word limits, so he waters down the content, and sometimes it reads more like literary writing than scientific writing, because they fill it with words that are rarely used or have multiple meanings. Like, am I supposed to understand this paper or feel it? It's very beautifully written, but on the first scan it doesn't mean crap.
From another perspective, get used to it, because academia is full of self-serving egotistical morons who just want to feel superior. You have to simplify the stuff they write to make sense out of it all.
@NoMad I have found PDFs of my textbooks and matched them with the syllabus. There are 4 units, and each topic in every unit is a 40-60 page chapter of the book.
So to complete the whole syllabus I have to read about 800 pages and around 20 chapters.
I don't even read novels that long 😠
@Stuxnet the good/bad part is that we don't have classes for any subject. We just show up for the exams, preparing minutes before the test.
Human-computer interaction should really teach the fundamentals behind UX. UX is more about design and software solutions, but HCI is about hardware, software, and human psychology. I mean, that's the reason we moved away from menus and submenus toward easier and more intuitive ways of interacting.
I believe I have UX and HCI right and haven't mixed them up. Please reply if I'm mistaken, devRanters.
Also, college teaches the architecture behind it all so you can understand what is happening at a more fundamental level, not treat it like magic without knowing its low-level design.
It certainly fails when it's all theory with no implementation shown.
Aagh, fuck college subjects. Over my last 4 years and 7 semesters in college, I must have said this many times: fuck college subjects. But later I realized that, if nothing else, they are useful in government/private exams and interviews.
But Human-Computer Interaction? WHAT THE FUCK IS WRONG WITH THIS SUBJECT???
It has a human in it, a computer in it, and interaction in it: sounds like a cool subject to pick up some robotics/AI design knowledge. But its syllabus, and the info available on the net, is worse than that weird alienoid hentai porn you watched one night (I know you did).
Like, here is a paragraph from the research paper I am reading; try to figure out whether its English is even correct:
============================
Looking back over the history of HCI publications, we can see how our community has broadened intellectually from its original roots in engineering research and, later, cognitive science. The official title of the central conference in HCI is “Conference on Human Factors in Computing Systems” even though we usually call it “CHI”. Human factors for interaction originated in the desire to evaluate whether pilots could make error-free use of the increasingly complex control systems of their planes under normal conditions and under conditions of stress. It was, in origin, a-theoretic and entirely pragmatic. The conference and field still reflects these roots not only in its name but also in the occasional use of simple performance metrics.
However, as Grudin (2005) documents, CHI is more dominated by a second wave brought by the cognitive revolution. HCI adopted its own amalgam of cognitive science ideas centrally captured in Card, Moran & Newell (1983), oriented around the idea that human information processing is deeply analogous to computational signal processing, and that the primary computer-human interaction task is enabling communication between the machine and the person. This cognitive-revolution-influenced approach to humans and technology is what we usually think of when we refer to the HCI field, and particularly that represented at the CHI conference. As we will argue below, this central idea has deeply informed the ways our field conceives of design and evaluation.
The value of the space opened up by these two paradigms is undeniable. Yet one consequence of the dominance of these two paradigms is the difficulty of addressing the phenomena that these paradigms mark as marginal.
=============================
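To make the quote's mention of "simple performance metrics" a bit more concrete: Fitts's law is the textbook example of that kind of metric in early human-factors / HCI work, and it also shows up in the Card, Moran & Newell line of modeling cited above. Here is a minimal Python sketch; the coefficients a and b below are made-up illustrative values that would normally be fitted from pointing experiments:

import math

def index_of_difficulty(distance: float, width: float) -> float:
    """Fitts's index of difficulty (Shannon formulation), in bits."""
    return math.log2(distance / width + 1)

def predicted_movement_time(distance: float, width: float,
                            a: float = 0.1, b: float = 0.15) -> float:
    """Predicted time in seconds to hit a target of a given width at a given
    distance. a and b are device/user-specific constants; the defaults here
    are purely illustrative, not measured values."""
    return a + b * index_of_difficulty(distance, width)

# A big, close button is predicted to be faster to hit than a small, far one.
print(predicted_movement_time(distance=100, width=50))  # ~0.34 s
print(predicted_movement_time(distance=800, width=10))  # ~1.05 s

That is the "a-theoretic and entirely pragmatic" flavor the paper is talking about: no deep theory of the user, just a formula that predicts whether a control can be operated quickly enough.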
tags: rant, what the fuck is this subject, what does it want, hci