7

I don't think we need to worry about judgement day anytime soon... If anything, we should be worried they miscalculate where the ICBMs are going to land...

Comments
  • 1
That looks like ChatGPT, and no, that is not yet reliable.

So far it has no way to verify an answer and can be wrong due to bad training data or edge cases.

I am pretty sure they do not use it for important calculations ;)
  • 1
Yes, you can't expect mathematical correctness (or any correctness, for that matter) from a prediction engine. Math has infinitely many numbers, but the data is finite, written by humans, and only covers limited subjects, so the rest is literally just predicting the next word in the sequence, not any kind of actual reasoning.

As I said before, prediction engines can only get so far... They are great at pretending to be human, but bad at both being a human and being a computer.

It will probably get better with time, but there's a limit to what you can do just by taking the previous N tokens of text and predicting token N + 1, no matter how many parameters you have... AGIs are a bullshit concept anyway...
  • 0
That's because you are using a potato, not the most advanced AI. You can be damn sure there is better AI already being developed that is a trade secret, especially at the defense or intelligence level.
  • 0
People were landing explosive devices on the moon using things that today wouldn't even qualify as potatoes - computers in the sixties had a few thousand transistors, not a few dozen million like today.
It is not a matter of faulty tech calculating poorly by itself; it is faulty humans using lazy sourcing to do their half-assed jobs.
    So future search tools could teach future ICBM-VR Blockchain programmers that Pi equals 4.20 and we all go boom.
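The "predict the next token" idea mentioned in the comments above can be reduced to a toy sketch. This is a bigram model, vastly simpler than any real LLM, and the corpus is made up; but it illustrates the point that the output is only as good as the training data, and that the model has no way to verify anything outside it:

```python
# Toy next-token predictor: a bigram model that returns the most
# frequent follower of the last token seen in training. Purely
# illustrative; the corpus below is invented for this example.
from collections import Counter, defaultdict

corpus = "the cat sat on the mat the cat ate the fish".split()

# Count which token follows which in the training data.
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def predict_next(token):
    """Return the most frequent follower of `token`, or None if unseen."""
    if token not in follows:
        return None  # no training data: the model simply has no answer
    return follows[token].most_common(1)[0][0]

print(predict_next("the"))  # "cat" - it follows "the" most often here
print(predict_next("pi"))   # None - nothing was ever learned about "pi"
```

Note that `predict_next` never checks whether its answer is *true*; it only reports what was statistically common in the corpus, which is exactly the failure mode the comments describe.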
Add Comment