8
C-sucks
2y

C, C++, and Java are legacy programming languages.

So, for those who fear that these languages will go away in a mere 10-15 years: chill. These languages will stay forever.

Comments
  • 6
    C and C++ will only be replaced when quantum computing is the default architecture.

    Java will go before that.
  • 2
    @sariel Won't quantum start out as a coprocessor, though? I thought it couldn't fit all algos well.
  • 5
    C and C++ will never go away, especially not in embedded or shader programming.
  • 3
    @c3r38r170 After a bit of thinking, I found no situation in my daily computer usage for which a quantum chip is more suitable than a cheap CPU that simply executes instructions one after another with certainty. Every feature of a quantum chip is either a hindrance or irrelevant. I don't think there's much more to discover in personal hardware - maybe a new instruction set that saves the compiler from guessing at the processor's expectations, and a few orders of magnitude of speed and efficiency improvements. Fundamentally though, we've achieved what a user realistically wants, and everything to come will be either optimization of the haphazardly stacked layers of abstraction or targeted at megacorps.
  • 1
    @lbfalvy I am excited for the terahertz chips that may be coming. I would also like better graphics in the next TES release, and generic physics libraries for handling model posing in games, to aid in doing realtime animation calcs rather than predefined animation sequences.
  • 0
    @lbfalvy You will need a multiple-worlds simulation with the next WhatsApp update.
  • 1
    @Demolishun I don't think that's where it will end up, but you're right. As of now, we need a co-processor as a medium to interface with current digital systems.
  • 1
    @lbfalvy "I don't know about you, but these new 'computers' are worthless. Nothing can compete with a good strong mind and endless sheets of paper or chalk boards." --- the 1870s man
  • 0
    @sariel At the dawn of personal computing there were few use cases because the pace of improvement was unforeseen. Now computing is no longer a greenfield; it's a young industry with decades of bloat. If there were use cases for something with substantially more processing power, they would have been developed already, because a lot of people anticipate exponential improvement.
  • 3
    @sariel I reckon the tech companies built on Java would disappear before Java itself does.

    Because as long as tech companies rely on Java, there will always be a need for Java developers.
  • 5
    @lbfalvy "At the dawn of personal computing there were few use cases because the pace of improvement was unforeseen."

    Thanks for seeing it my way. /s

    Let's reword that a bit...

    "At the dawn of personal quantum computing there were few use cases because the pace of improvement was unforeseen."

    There we go.

    Many of the solutions that quantum computing provides do not yet have problems to solve.

    Your inability to imagine the improvements is insufficient evidence that quantum computing at a personal level is never going to happen. On the contrary, history shows us that any advancement in technology provides huge strides in development once it reaches the hands of the masses.

    I think you're also overestimating where the transistor ceiling is. In the next 5-10 years, we won't have faster processors; we'll have physically bigger ones to handle more threading.

    Unless we discover a new way to create transistors, we can see the ceiling now.
  • 1
    @sariel I think they are getting close to light-based CPUs. At least that is what I am reading about. One can hope. Heat issues might be reduced if they figure this out.
  • 1
    @lbfalvy The only application worth mentioning is completely breaking all the "one-way hard math" crypto that we're using right now - which is why everyone is trying to get there.
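    A toy sketch of that "one-way hard math" asymmetry, RSA-style (the primes below are tiny illustration values, nowhere near real key sizes): multiplying two primes is instant, while recovering them from the product by trial division scales with the square root of the product. Shor's algorithm on a quantum computer would do that recovery in polynomial time, which is what breaks the scheme.

        #include <stdio.h>
        #include <stdint.h>

        int main(void) {
            uint64_t p = 999983, q = 1000003;  /* small demo primes */
            uint64_t n = p * q;                /* easy direction: multiply */
            printf("n = %llu\n", (unsigned long long)n);

            /* Hard direction: brute-force the smallest prime factor. */
            for (uint64_t f = 2; f * f <= n; f++) {
                if (n % f == 0) {
                    printf("factor: %llu\n", (unsigned long long)f);
                    break;
                }
            }
            return 0;
        }

    At these toy sizes the loop finishes in a blink; at real 2048-bit sizes the same brute force would outlast the universe - that asymmetry is what current crypto leans on, and what a large quantum computer would erase.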
  • 2
    @Demolishun That won't be anytime soon, given that raw clock frequencies have stalled for a long time now. Not only is there no suitable material in sight, but the wavelength at 5 GHz is just 6 cm, and at 1 THz it would be about 0.3 mm - i.e. much smaller than the chip itself (quick sanity check below).

    Computing is a mature technology. Computers can't do anything actually new that they couldn't already do 20 years ago. That's why the progress since 2000 has gone into enabling the same applications, but on mobile.
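    Checking those numbers with lambda = c / f, assuming free-space propagation (on-chip signals travel slower than c, so the effective wavelength is even shorter):

        #include <stdio.h>

        int main(void) {
            const double c = 2.998e8;             /* speed of light, m/s */
            const double freqs[] = { 5e9, 1e12 }; /* 5 GHz and 1 THz */

            for (int i = 0; i < 2; i++) {
                double lambda = c / freqs[i];     /* lambda = c / f */
                printf("f = %.0e Hz -> lambda = %.2f mm\n", freqs[i], lambda * 1e3);
            }
            return 0;
        }

    That works out to roughly 6 cm at 5 GHz and 0.3 mm at 1 THz, i.e. at terahertz clocks the signal wavelength drops far below the size of the die.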
  • 2
    @sariel I don't know how much bigger they'll get; yields will go down too much if they get really big. Personally I believe in a multiple-chips/distributed approach to get more processing power, or alternatively in looking at ways to decrease computational need and focus on low power.
  • 1
    @matt-jd that's kind of what I think will happen: multiple dies on a single "chip".

    I remember Intel tried to put the P2 on an ISA-slot-style card. In an extreme situation I think we might see those again.

    That, or we're going to see mobo standards change to allow more real estate for more CPUs.

    But I'm fairly certain we've hit the silicon density threshold.
  • 2
    @sariel That's what already happened with the chiplet design in Ryzens, and while Intel initially mocked that, they're heading the same way.

    It makes sense: having two chiplets instead of one chip twice as large not only makes better use of the round wafer, but also means that in case of a defect you end up with one intact and one broken chiplet instead of one broken larger chip (toy numbers below).

    The interconnect problem is already solved.
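    To put rough numbers on that, here's a toy yield calculation using the common zero-defect (Poisson) yield model Y = exp(-D * A); the defect density and die areas are made-up illustration values, not real process data:

        #include <math.h>
        #include <stdio.h>

        int main(void) {
            const double D = 0.2;            /* hypothetical defects per cm^2 */
            const double big_area = 6.0;     /* one large die, cm^2 */
            const double chiplet_area = 3.0; /* each of two chiplets, cm^2 */

            double y_big = exp(-D * big_area);      /* large die defect-free */
            double y_chip = exp(-D * chiplet_area); /* one chiplet defect-free */

            printf("large monolithic die good:  %.1f%%\n", 100 * y_big);
            printf("one chiplet good:           %.1f%%\n", 100 * y_chip);
            printf("both chiplets good:         %.1f%%\n", 100 * y_chip * y_chip);
            printf("at least one chiplet good:  %.1f%%\n",
                   100 * (1 - (1 - y_chip) * (1 - y_chip)));
            return 0;
        }

    Both chiplets coming out good is exactly as likely as the single big die coming out good (the areas, and therefore the exponents, add up), but when a defect does land, the split design usually salvages half the silicon instead of scrapping all of it.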