Technology Wall?

FlameChrome

I could be the only one who feels this way, but as I see it we've mostly hit a technology wall, where tech isn't improving as much as it used to. Yes, there have been improvements in recent years, like how AMD Ryzen grew from its launch to where it is today, or ray tracing on GPUs. Beyond that, though, most "improvements" are just the best of the best becoming cheaper, which is nice. What do y'all think? Have we basically hit a wall and stopped seeing much improvement, or am I just blind?
     
I agree, and it's weird how people can't seem to get this; the passage of time makes it clear. A decade ago people used basically the same everyday technology as today.
Circa 2000, my mom had a beeper.
In 1990, normal people didn't have the internet.
In 1980, satellite communication wasn't normalized for the populace.
In 1970, normal people did not have access to computers.
Etc.

Each decade before had huge leaps in technology; it's plateaued.
     

Agreed with the above. Now we've pretty much settled on our standards: lightweight but sometimes powerful laptops, and desktops that can do a little 8K, though usually 4K or 1080p is what people go with. Loads of people now see lots of RAM and storage as "too much" (though ask those same people about their cloud storage and there will probably be a lot in there; I'm working on getting a lot of cloud storage myself, lol). In other words, we've just hit a "good enough" standard.
     
I'm waiting for the day wires become 100% obsolete. It would be nice for "wireless" devices to no longer be temporarily tethered while charging. And heck, if cable management for PCs were no longer a thing, it would probably stop people from shying away from building one!
     
Agreed. As far as I'm concerned, smartphones are done: not in the sense that they're obsolete, but in the sense that they can't go much further. Look at iPhones; they've been nearly the same since the X, and it's the same story for Android phones. I don't think we'll see anything truly new for a long time. For now, the looks are all they can improve on, at least until something major happens.
     
While I agree that technology has started to slow down, I don't think it's because of limitations on the technology itself, but because companies are running out of ways to capitalise on the advancements. New technologies are being discovered and trialled, from foldable screens to self-driving cars, but companies haven't found a way to market them to the masses yet, mostly because of the cost factor.
I say give it a couple of years and we'll see Apple release the brand-new "innovative" iPhlip or something, and people will start to believe we're living in the future again.
     
On wires becoming obsolete: wireless charging is becoming a thing for mobile devices, albeit a limited one, but adoption has been slow.
     
I wouldn't personally say things are slowing down. It's just that a lot of the improvements aren't ones the average consumer really pays attention to.

Hardware has slowed down a bit lately as Moore's Law tapers off, but that doesn't mean we aren't seeing major changes in the industry.
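Moore's Law is really just exponential doubling: transistor counts doubling roughly every two years. For a sense of what that curve implies, here's a minimal C sketch of the arithmetic, assuming (purely for illustration) a baseline of about 42 million transistors around 2000, roughly a Pentium 4:

[code]
#include <math.h>
#include <stdio.h>

int main(void) {
    /* Moore's Law: transistor counts double roughly every two years.
       The 42-million baseline (ballpark of a Pentium 4 in 2000) is an
       assumption for illustration, not a precise figure. */
    const double base = 42e6;
    for (int year = 2000; year <= 2020; year += 4) {
        double count = base * pow(2.0, (year - 2000) / 2.0);
        printf("%d: ~%.0f million transistors\n", year, count / 1e6);
    }
    return 0;
}
[/code]

That projects roughly 43 billion transistors by 2020, which is in the ballpark of the largest chips that actually shipped around then, so the curve held up longer than many people expected.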

Recall that we're about to see a large influx of ARM-based CPUs (both Apple and Microsoft are moving to in-house production). We're getting better GPUs because PCIe is getting very, very good. Wireless connectivity speeds are getting crazy fast with both 5G and WiFi 6, and display technology is insane right now.
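To put rough numbers on "PCIe is getting very, very good": each generation doubles the per-lane signalling rate, so a 16-lane GPU slot has gone from about 16 GB/s of usable bandwidth in PCIe 3.0 to about 32 GB/s in 4.0 and about 63 GB/s in 5.0. A small C sketch of that arithmetic, using the per-lane rates and encodings from the published specs:

[code]
#include <stdio.h>

int main(void) {
    /* Per-lane transfer rates (GT/s) from the PCIe specs. Gens 1 and 2
       use 8b/10b encoding; gens 3 and later use 128b/130b. */
    const double rate_gts[] = { 2.5, 5.0, 8.0, 16.0, 32.0 };
    const int lanes = 16; /* a typical GPU slot */

    for (int gen = 1; gen <= 5; gen++) {
        double efficiency = (gen <= 2) ? 8.0 / 10.0 : 128.0 / 130.0;
        /* GT/s * encoding efficiency = usable Gb/s per lane;
           divide by 8 to convert to GB/s. */
        double gb_per_lane = rate_gts[gen - 1] * efficiency / 8.0;
        printf("PCIe %d.0 x%d: ~%.1f GB/s\n", gen, lanes, gb_per_lane * lanes);
    }
    return 0;
}
[/code]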

Moore's Law has definitely stunted things, but improvements in other aspects of technology let us keep making what we already have even better. I'm hoping we'll eventually see much wider adoption of wireless charging and communication technologies, and perhaps a phone with a true 100% screen-to-body ratio before long. With cameras and fingerprint scanners now going under the display, it's only a matter of time.

And it's interesting to think about ARM processors; I'm curious to see how smoothly the adoption process will go. Considering x86-64 has been around for a very, very long time, switching to a new architecture will absolutely come with a lot of growing pains.

Edit: I should say it will have a lot of growing pains for many reasons. The biggest is the sheer number of computing devices people own now that technology is so much more accessible, which is a good thing and a bad thing at the same time. Back in the mid-2000s, when Apple switched from PowerPC to Intel processors, they introduced Rosetta (just as they're now doing with Rosetta 2 for the Intel-to-Apple-Silicon move) so that PPC applications could run on Intel while people switched over. But that transition was easier: computers were more expensive then than they are now, so there was less software and hardware to carry across. Now, with everyone and their dog owning one or more PCs, the transition to ARM is going to take many years, and x86-64 applications will be around for a long time after the initial switch.
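For the curious, that translation layer is something a program can actually observe: on macOS, Apple documents a sysctl named sysctl.proc_translated that tells a process whether it's running natively or being translated by Rosetta 2. A minimal C sketch:

[code]
#include <stdio.h>
#include <sys/sysctl.h>

/* Returns 1 if this process is running x86-64 code translated by
   Rosetta 2, 0 if it is running natively, and -1 if the sysctl does
   not exist (e.g. an Intel Mac without Rosetta, or a non-Mac). */
static int running_under_rosetta(void) {
    int translated = 0;
    size_t size = sizeof(translated);
    if (sysctlbyname("sysctl.proc_translated", &translated, &size,
                     NULL, 0) == -1)
        return -1;
    return translated;
}

int main(void) {
    switch (running_under_rosetta()) {
        case 1:  puts("Running under Rosetta 2 translation."); break;
        case 0:  puts("Running natively on this CPU.");        break;
        default: puts("Translation status unavailable.");      break;
    }
    return 0;
}
[/code]

Build it as an x86-64 binary and run it on an Apple Silicon Mac and it reports translation; build it as arm64 and it reports native.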
     