Search - "qualcomm"
-
!rant
Has anyone been paying attention to what Google's been up to? Seriously!
1) Fuchsia. An entire OS built from the ground up to replace Linux and run on thin microcontrollers that would bog down under Linux — it has GNU compilers & Dart support baked in.
2) Flutter. It's like React Native but with Dart and more components available. Super Alpha, but there's "Flutter Gallery" to see examples.
3) Escher. A GPU renderer that coincidentally focuses on the features Material UI needs; it's used with Fuchsia. I can't find screenshots anywhere; unfortunately I tore down my Fuchsia box before trying this out. Be sure to tag me in a screenshot if you get this working!
4) Progressive Web Apps (aka Progressive Web APKs). Chrome has an experimental feature to turn web apps into hybrid native apps. There's a whole set of documentation for converting and creating apps (rough sketch of the moving parts below).
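To make #4 concrete: roughly, Chrome offers the "install" treatment when a page is served over HTTPS, links a web app manifest, and registers a service worker that can respond offline. Below is a minimal sketch of that wiring in TypeScript; the file names and cache name are placeholders I made up, not anything Chrome mandates.

```typescript
// Minimal installable-web-app sketch. "/sw.js" and "app-shell-v1" are placeholder names.

// In the page: <link rel="manifest" href="/manifest.json"> plus this registration.
if ('serviceWorker' in navigator) {
  navigator.serviceWorker
    .register('/sw.js')
    .then(reg => console.log('service worker registered, scope:', reg.scope))
    .catch(err => console.error('service worker registration failed:', err));
}

// In sw.js: a bare-bones offline cache so the app still loads without a network.
const CACHE = 'app-shell-v1';
self.addEventListener('install', (event: any) => {
  event.waitUntil(
    caches.open(CACHE).then(cache => cache.addAll(['/', '/index.html', '/app.js']))
  );
});
self.addEventListener('fetch', (event: any) => {
  event.respondWith(
    caches.match(event.request).then(hit => hit ?? fetch(event.request))
  );
});
```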
And enough about Google; Microsoft actually had a really cool announcement as well! (hush hush, it's really exciting for once, trust me)...
Qualcomm and Microsoft teamed up to run the full desktop version of Windows 10 on a Snapdragon 820. They went so far as to show off the latest x86 desktop version of Photoshop, unmodified, running with excellent performance. They've announced full support for the upcoming Snapdragon 835, which will be a beast compared to the 820! This is all done through emulation and interop libraries/runtimes, similar to how Wine runs Windows apps on Linux (but with much better compatibility and a more complete runtime).
Lastly, (go easy guys, I know how much some of you love Apple) I keep hearing of Apple's top talent going to Tesla. I'm really looking forward to the Tesla Roof and Model 3. It's about time someone pushed for cheap lithium cells for the home (typical AGM just doesn't last) and made panels look attractive!
Tech is exciting, isn't it!?
-
I use a Mac in the office for daily dev tasks, and a Lubuntu laptop for Linux-related jobs. My home PC runs Ubuntu, and my personal laptop dual-boots Ubuntu and Windows. I haven't used the Windows side for more than a year. I even do most basic tasks in the terminal, on both Mac and Linux.
Now I need to install a custom ROM on my Snapdragon-based Android phone, and all the tools I need run on Windows only. I don't even want to open that boot partition. 😥
Any help on this? It would be better if I could get something that runs on Linux.
-
Google, Microsoft, IBM, Qualcomm, and Intel formed one team. They betrayed the world. They are ruling the world.
I am not joking..
.
.
.
.
.
.
Then I woke up.
Damn it :-(
-
The default USB voltage should have been specified as 6 volts instead of 5.
Six (6) volts would allow for longer cables than five (5) volts does, since the spare voltage compensates for the resistance of the cable. This is even more crucial for USB hubs, which are heavily depended upon these days because laptop vendors keep dropping the number of USB ports down to two or even one. I am looking at you, Medion.
If several devices are connected to a USB hub, the voltage can quickly drop below 4.5 volts due to the resistance between the USB hub ports and the computer's USB port, causing some devices to restart themselves even if the computer's USB port is not over capacity. If it were over capacity, it would just regulate down its output voltage to prevent overcurrent.
Lithium-ion batteries need at least 4.3 volts arriving at the battery terminals to fully charge, and mobile devices are typically not equipped with a boost converter. Even if they were, boost converters are rather inefficient, meaning they would produce significant heat and waste a power bank's energy. Other USB devices such as flash drives and peripherals might power off below 4.5 volts. Six volts, however, leaves a solid 1.7 volts of margin above 4.3 volts, more than twice the 0.7 volts of margin that five volts leaves. On the way from the power supply to the end device, the voltage has to pass several barriers which weaken it: the cable, the connectors, and the end device's internals such as the charging controller.
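To put rough numbers on that margin argument (the 2 A draw and the 0.25 Ω of round-trip cable/connector resistance below are illustrative assumptions, not spec values):

```typescript
// Back-of-the-envelope Ohm's law: V_device = V_source - I * R_roundtrip.
const voltageAtDevice = (vSource: number, amps: number, ohms: number): number =>
  vSource - amps * ohms;

console.log(voltageAtDevice(5, 2.0, 0.25)); // 4.5 V: already at the brown-out edge
console.log(voltageAtDevice(6, 2.0, 0.25)); // 5.5 V: a full volt of headroom left

// Cable budget before the device sees less than 4.5 V:
const maxCableOhms = (vSource: number, amps: number) => (vSource - 4.5) / amps;
console.log(maxCableOhms(5, 2.0)); // 0.25 Ω
console.log(maxCableOhms(6, 2.0)); // 0.75 Ω: roughly three times the cable budget
```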
Sure, there are quick-charging standards such as those from Qualcomm and MediaTek which support raising the voltage to nine (9), twelve (12), or even twenty (20) volts. However, they require support from both the charger and the mobile device. If six (6) volts had been the default USB voltage, all devices would have been designed to accept it, and longer cables could have been used anywhere. Obviously, all USB devices should still be able to run on five volts as well.
Six volts would have been more stable, flexible, and reliable.
-
Career fair update:
At the fair I bee-lined right to a company I'd talked to at a hackathon before. The person I talked to there was extremely enthusiastic, and called me a "really strong candidate," and even talked about providing housing for the summer near the workplace!
In contrast, AMD's interviewer looked extremely uninterested. Pretty sure they just grabbed some random engineer who didn't want to be there.
Microsoft had a line so long that the fair ended before most of the people even got to talk to Microsoft. Needless to say, I didn't even bother.
Qualcomm seemed cool, that went alright.
Overall, I'm really happy with how the fair turned out, and really excited about how likely this job looks for me.
-
CSRMESH QUALCOMM IS THE WORST!
I've spent months on it and I'm still confused about what their source code actually does; it is so dirty and unspecific. And they never update it.
I should just go for Nordic instead.
-
TL;DR: I need your advice regarding a new workhorse of a laptop, and on ARM / the MS Surface 10 / Laptop 6 for this purpose.
So my high-end Dell XPS (9350) keeps annoying me with its screen flickering. It's an 8-year-old ultrabook with 16 GB of RAM that I'm using extensively for development, devops, research, and whatnot. 16 GB of RAM is also becoming... not enough for all of it.
So I'm passively looking for an upgrade. I like the 13" profile (ultrabook style) and battery life, so I'd like to stay away from gaming laptops.
There has been talk about ARM being the new thing. I always saw ARM as a consumer-grade CPU arch (browsing, movies, music, docs, etc.), but the internet says that the new MS Surface devices will have ARM/Qualcomm chips built in, can compete with the MacBook Pro in terms of performance (ref.: https://windowscentral.com/hardware...), and are allegedly being released this spring.
I'm not much of a hardware person, I prefer staying on the logical level of things, so I want to ask you, people smarter than me: what do you think? Is it a feasible upgrade from an XPS 13 (i7 Skylake / 16 GB RAM / 4K touch)? I'll be running code and image builds A LOT, using JetBrains IDEs, and doing similar resource-intensive tasks. I don't care at all about GPUs; I don't use them (integrated graphics has always been sufficient).
What else should I consider?
Any alternatives?
P.S. While I can't stand Windows, I actually like MS's hardware; they are good at making it.
-
I have another issue 😬 with one of my Ubuntu laptops. The Wi-Fi chip somehow has connection issues; it has almost no signal. I have to be right beside the router to get a connection.
I already swapped the chip for a new Qualcomm Atheros one, but there was no change. I also tried a few "speed up your Wi-Fi connection" blog posts, but that changed nothing either.
Do you guys have any advice? :-\
-
Some of you defended Samsung Galaxy smartphones, but you're just lucky to have gotten the right one. In some countries (most of them, in fact), including Russia, we only get Samsungs with the Exynos CPU, and that shit overheats and throttles itself down even when you're just watching YouTube. LTE quality is also garbage; the S20 Ultra literally drops the connection for no apparent reason, while iPhone connection quality is flawless.
If you're happy with your Samsung, chances are it has a Qualcomm SoC, not an Exynos. The Qualcomm ones are sold in the USA, for example.
There is no obvious way to tell which SoC is inside when you buy a Samsung. You're lucky if it's in the fine print. You have to install AIDA and read from there which SoC you're about to buy.
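For what it's worth, once you can actually plug the phone in, the SoC can also be read over adb instead of installing AIDA. A rough sketch (it assumes adb is installed, USB debugging is enabled, and the exact property values vary by device):

```typescript
// Rough sketch: ask the attached phone which SoC platform it reports.
import { execSync } from 'child_process';

function readProp(prop: string): string {
  return execSync(`adb shell getprop ${prop}`).toString().trim();
}

// ro.board.platform typically looks like "kona" on Snapdragon devices
// and "universal..."/"exynos..." strings on Exynos devices.
console.log('platform:', readProp('ro.board.platform'));
console.log('hardware:', readProp('ro.hardware'));
```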
-
The CSR Mesh library by Qualcomm has brainfxxxed me for a month already. My company's profile can just use Nordic instead.
-
!rant
I don't like how the hardware industry is so far ahead of the software industry. Almost all new hardware released these days is massive overkill for any software that's out there.
Qualcomm has Snapdragon 8 Gen 2 chips out, but there aren't any Android games that need more than a Snapdragon 888.
NVIDIA has the RTX 4090 out, but there aren't any games that need more than an RTX 2070 to run at good FPS.
The PS5 and Xbox Series X have a very small library of games that can't run on a previous-gen console.