Tech Companies are saying goodbye to Moore's Law

Started by Menaus, 08 October 2016, 11:49:08 PM


Menaus

It seems that tech companies aren't going to keep focusing on doubling transistor counts every two years.

To me, this is exciting. Since computers began, tech companies have had a choice between various ways of improving their hardware, and up until now the most profitable has obviously been shrinking die sizes. That isn't the case anymore, but it doesn't mean hardware won't improve. It means tech companies will focus on other avenues, like improving architecture or rethinking the assumptions about what is most efficient.

Economically, the rising cost of shrinking die sizes means the rest of the market now values those resources more highly when they're put toward other uses. I think that is the main factor behind abandoning Moore's law, more than technological limitations. What this tells me is that there is potential for entirely new tech companies to overtake Intel or Nvidia. Since those companies have focused mainly on shrinking die sizes, they're not as good as they could be in other areas. Either they have to get better, or a new company with new ideas will overtake them completely. Some people might consider this a little far-fetched, but it happened to IBM, so why not now?

Anyone else have thoughts on this?
"You state that I have misinterpreted my results, and it looks as though you believe my views to be unsound. Your arguments are those of an eminent scholar. I was myself a fair scholar. For years I pondered, so to speak, day and night over books, and filled my head with sound views–very sound ones, indeed—those of others. But I could no[t] get to practical results. I then began to work and think independently. Gradually my views became unsound, but they conducted me to some sound results." - Nikola Tesla

Daddy Poi's Oily Gorillas

#1
:D Funny how you're researching the same things I did a long time ago (and still do)... though I may not have seen that specific page.

It is interesting stuff...

~Sept. 27th, I read something on chaos theory:
https://www.engadget.com/2016/09/26/researchers-think-chaos-theory-can-get-us-past-moores-law/
http://www.digitaltrends.com/computing/transistor-design-chaos-theory/
Golden Sun Docs: Broken Seal - The Lost Age - Dark Dawn | Mario Sports Docs: Mario Golf & Mario Tennis | Misc. Docs
Refer to Yoshi's Lighthouse for any M&L hacking needs...

Sometimes I like to compare apples to oranges. (Figuratively) ... They are both fruits, but which one would you eat more? (If taken literally, I'd probably choose apples.)
Maybe it is over-analyzing, but it doesn't mean the information is useless.


The only GS Discord servers with significance are:
Golden Sun Hacking Community
GS Speedrunning
/r/Golden Sun
GS United Nations
Temple of Kraden

Can you believe how small the Golden Sun Community is?

2+2=5 Don't believe me? Those are rounded decimal numbers. Take that, flat earth theorists! :)

charon the ferryman

To be brutally honest, I'm actually more excited to see software improvements rather than hardware. Hardware is at such a point now that as software developers we're able to abstract more and more and produce ever more complex systems. For example, my area of expertise, blind accessibility, would have been pretty much impossible back in the '90s, but in the last 10 or so years hardware has been more than adequate to actually explore these realms. The problem is that it's uncharted territory and you kind of have to be a trailblazer, but there are literally hundreds of fields that need this sort of development.

One of the major problems with hardware is memory addressing: handling larger and larger memories grows more difficult as we have to update our addressing schemes. Furthermore, it's difficult to make sure programs stay cross-compatible across these different schemes (which is why 16-bit software has such a struggle running on 64-bit operating systems). On top of that, there's the problem of shrinking more and more of these components to a smaller and smaller scale; it's getting to the point where the components are reaching the limits of their current design, and a design revolution is needed to go even further.
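
Just to put rough numbers on why address width matters (a quick back-of-the-envelope sketch in C, theoretical ceilings only, not what any real machine exposes):

Code:
#include <stdio.h>

/* Maximum bytes addressable with an n-bit address is 2^n. */
int main(void) {
    printf("16-bit addresses: %llu bytes (64 KiB)\n", 1ULL << 16);
    printf("32-bit addresses: %llu bytes (4 GiB)\n", 1ULL << 32);
    printf("64-bit addresses: 2^64 bytes (16 EiB)\n"); /* shifting 1ULL by 64 is undefined, so just state it */
    return 0;
}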

One possibility is Quantum Computing; while I don't see it necessarily being in everyone's hands in 10 years or something like that, I think it will likely be used with large computers like mainframes or even potentially large servers, because of its ability to compute many things at once. However, the biggest problem there is because the modelling scheme is different, programming for quantum computers has turned out to be quite difficult.

However, I believe the best use of most of tech's time right now is in software. There's a huge amount of uncharted territory there. Blind accessibility is just one example of the thousands of fields in need of growth. You know how you hear about some random dude who comes up with a software solution and makes a @#$% ton of money? That's how it happens, and that's the direction tech is really taking.

Luna_blade

Yeah, software has lots of uncharted space.

Why is addressing memory a problem? As long as the address fits in a word, there should be no problem.

Quote: One possibility is Quantum Computing; while I don't see it necessarily being in everyone's hands in 10 years or something like that, I think it will likely be used with large computers like mainframes or even potentially large servers, because of its ability to compute many things at once. However, the biggest problem there is because the modelling scheme is different, programming for quantum computers has turned out to be quite difficult.
I'm sure they will find something new, but I don't think it will be quantum computing.
Writing programs for it is indeed quite difficult. From what I've read, chances are that QC won't even be faster than traditional computing.

I think CPUs/software will tend toward more SIMD in the coming years. With Big Data and all those databases, it seems like a natural fit.
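
For a minimal sketch of what SIMD looks like (using x86 SSE intrinsics here; the details vary by platform):

Code:
#include <stdio.h>
#include <immintrin.h> /* x86 SSE intrinsics */

int main(void) {
    float a[4] = {1, 2, 3, 4};
    float b[4] = {10, 20, 30, 40};
    float out[4];

    /* One SIMD instruction adds four floats at once,
       instead of a four-iteration scalar loop. */
    __m128 va = _mm_loadu_ps(a);
    __m128 vb = _mm_loadu_ps(b);
    _mm_storeu_ps(out, _mm_add_ps(va, vb));

    printf("%g %g %g %g\n", out[0], out[1], out[2], out[3]);
    return 0;
}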
"Hear the sounds and melodies
Of rilets flowing down
They're the verlasting songs
Whispering all the time
As a warning that behind some rocks
There's a rigid grap even
Oreads fear the tread"

charon the ferryman

#4
Addressing has historically been a problem more because of transitioning data from one scheme to another than because of anything to do with the physical components. You have to consider all the data stored on all these systems, but more importantly the software used to access that information. Much of that data will remain accessible because the big software packages stay up to date, but on a smaller business scale it's much more troublesome, considering how many businesses rely on custom software and data. While 64-bit addressing might last us quite a while, I'm not certain it will meet future needs, considering how much our needs have expanded in the last 40 or so years. Compatibility with older work will always be a major issue when upgrading to new schemes.

I'm not a huge expert on hardware, as I said before, but a common problem is hardware producers' inability to understand practical uses as well. I mean, christ, some organizations still use OS/2 lol, which is why Blue Lion is even a thing.

From what I know about quantum computing, it looks like something that could be useful for specific tasks, which is why I think it will be relegated to specialized machines. One huge advantage of quantum computing is that a qubit in superposition effectively lets you work on many possibilities at once. It will probably see the most use in computer security: connect to a host that runs the operation on a quantum computer and get a response back very quickly, as opposed to having your local machine calculate it itself.
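
To make the "many things at once" part a bit more concrete, here's a toy classical simulation of a single qubit (just my own sketch; real quantum hardware is not programmed like this): two complex amplitudes, a Hadamard gate to create an equal superposition, and measurement collapsing to 0 or 1 with the squared-amplitude probabilities.

Code:
#include <stdio.h>
#include <stdlib.h>
#include <math.h>
#include <complex.h>

int main(void) {
    /* State |psi> = a|0> + b|1>, starting in |0>. */
    double complex a = 1.0, b = 0.0;

    /* Hadamard gate: puts the qubit in an equal superposition. */
    double h = 1.0 / sqrt(2.0);
    double complex a2 = h * (a + b);
    double complex b2 = h * (a - b);

    /* Measuring collapses the state: P(1) = |b2|^2, which is 0.5 here. */
    double p1 = cabs(b2) * cabs(b2);
    srand(42);
    int ones = 0;
    for (int i = 0; i < 1000; i++)
        if ((double)rand() / RAND_MAX < p1) ones++;
    printf("measured |1> in %d of 1000 trials (expect ~500)\n", ones);
    return 0;
}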


Daddy Poi's Oily Gorillas

#6
I do agree that software development has a lot to be gained from... and it's usually the biggest reason some of our apps are slow (or sometimes not working) when they don't need to be.

However, there are still some things relating to hardware that I'd find cool to see... Examples:
-Stop computers from overheating / needing a fan. (Moore's Law has likely made this a bigger curse....?)
-Longer battery life / no need for a charger. (e.g. imagine something like built-in solar panels. :P)
-Faster file transfers / duplication. (Last I remember trying, it took about 10 seconds to transfer a 1 GB file from a snippet of code.)

charon the ferryman

#7
All three of those hardware issues are completely different though and most aren't related to Moore's law at all.

The overheating problem has to do with the design of the components within a computer. I think components will most likely be developed to be more heat resistant rather than relying on better cooling technologies, since computing is expanding in its integration (the internet of things, so to speak). The most obvious example is a smartphone: heat resistance offers a lot more utility across different environments. I mean, I guess you could argue it's related to Moore's law because it's a barrier to improvement, but whatever lol.

Battery life is something else entirely and depends on the battery technology as well as how much electricity all the components use. Software can actually help a lot here, because better, more efficient software costs far less battery life. Ever played a cheap game on a phone and noticed how much battery it drains? That's down to lack of optimization, which means more calculations processed per second. In fact, part of the reason GBA games were so archaic compared to other software titles of the time was this battery-life problem, which is becoming more and more relevant as software becomes easier to build for mobile devices. Fifteen years ago we needed to squeeze out every last bit we could so the batteries didn't fart away in an hour or two, because nobody wants to keep replacing a ton of batteries all the time.

File transfers are an issue of file I/O: writing to secondary storage is much slower (but more permanent) than reading/writing RAM, and improvements to our memory capabilities won't really do anything for that. If the disk head is too slow, so is the I/O operation. In fact, I think this is more of a bottleneck than Moore's law slowing down or not being followed. It's true that better code can make transfers faster, but that's more a question of how the I/O operations are coded in the first place; something written in Java, for example, is much slower because of its interpreted, abstracted nature compared to a compiled language. Either way, it all still relies on the disk head being able to read quickly, and if the technology isn't there, it's not there.
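
As a rough sketch of the "how it's coded" side (the 1 MiB buffer size is an arbitrary illustrative pick, not a tuned value, and the disk is still the hard floor): copying in big chunks instead of byte-by-byte cuts the number of I/O calls enormously.

Code:
#include <stdio.h>
#include <stdlib.h>

/* Copy src to dst in 1 MiB chunks instead of one byte at a time.
   Fewer, larger reads amortize the per-call overhead; the disk
   itself still sets the ultimate speed limit. */
int copy_file(const char *src, const char *dst) {
    FILE *in = fopen(src, "rb");
    if (!in) return -1;
    FILE *out = fopen(dst, "wb");
    if (!out) { fclose(in); return -1; }

    enum { CHUNK = 1 << 20 }; /* 1 MiB */
    char *buf = malloc(CHUNK);
    if (!buf) { fclose(in); fclose(out); return -1; }

    size_t n;
    while ((n = fread(buf, 1, CHUNK, in)) > 0)
        fwrite(buf, 1, n, out);

    free(buf);
    fclose(out);
    fclose(in);
    return 0;
}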

One important thing to note: I think one of the most important future developments in technology involves the integration of different hardware with each other. We're already starting to see this with businesses storing their data on huge arrays of servers and/or using mainframes. The communication between these devices compensates for any one device's inability to cover all bases. I think this concept will expand dramatically over the next few decades of development and compensate for a lot of the hardware limitations we have now. However, I also think it will bring forward a new age of how we handle information, because of the huge security changes involved.

For example, one of the most frustrating things about the current state of blind accessibility is that it's a "tack-on" interface rather than something properly integrated and flowing with the rest of the software architecture. This makes integration difficult, tedious, and often pushed aside despite being a requirement in most business software. People lose their jobs over this @#$%; I know some personally. One of my main design goals is to bridge that gap: to act as a true translator of information rather than just a tacked-on interface that works with current screenreader software, and to integrate the ideas of blind-sighted translation into mainstream design. The benefits not only help blind accessibility but also provide a built-in alternative input/output method if something goes wrong, which is a major problem imho these days. Again, this is yet another example of how software can solve problems that exist in hardware.

Daddy Poi's Oily Gorillas

#8
Quote: All three of those hardware issues are completely different though and most aren't related to Moore's law at all.
Relation to Moore's law wasn't a prerequisite for listing them. (I was mainly establishing hardware points. I mean, I guess you could think of hardware as the "profits" and software as the "expenses"; you can choose to optimize one, or both. I like both, because why not. :) It gets pretty exciting when you can get your mandatory expenses down to ~0, though.)

Quote: The overheating problem has to do with the design of the components within a computer. I think components will most likely be developed to be more heat resistant rather than relying on better cooling technologies, since computing is expanding in its integration (the internet of things, so to speak). The most obvious example is a smartphone: heat resistance offers a lot more utility across different environments. I mean, I guess you could argue it's related to Moore's law because it's a barrier to improvement, but whatever lol.
I mentioned Moore's law from memory of something I read a long time ago... but it would be more correct to name the specific process that's occurring. *Quickly googles and puts up something random that may or may not be related, just to make a quick point: "The doubling has already started to falter, thanks to the heat that is unavoidably generated when more and more silicon circuitry is jammed into the same small area."

---
@Battery life = Yes... The worst part is when you don't have access to the source, unfortunately... so how do we optimize then? Either way, anything less than a day of battery life is far from what I'd want as an end goal for mainstream computing.

@File transfer = I only mentioned it with hardware in mind, not necessarily Moore's law. Even the type of hard drive you have could be a factor... But then, do we even really need a hard drive if all of our permanent info ends up on the cloud, once (some day) the internet is easily accessible from anywhere in the world (because of something like internet.org)? (Sure, people will find that spooky. And I can double down on "permanent" because of the nature of the internet.)

charon the ferryman

#9
Moore's law is pretty much a memory problem at its highest level. Sure, other components are involved in making memory useful (everything from transistor design to stress resistance and all sorts of other factors), but they're not exclusive to Moore's law and aren't really part of the issue at an abstract level. The problem with Moore's law now is that we're in a different stage of technological development, where memory is no longer as important as implementation, since if you really need to, you can connect a ton of computers together to act as a single unit for most applications.

Like I said previously, memory was much more important as this tech was developing over the last 40 years because of how limited resources really were. I mean, you've hacked games, and you KNOW the GBA really does need all the memory it has to do what it needs to do; you really have to clamp down on how you build your game or program for the hardware to handle it. That's so little of an issue these days that people can just write an Android app in Java, no problem. It's really crazy to see how technology has grown in just 10-15 years.

All of those problems still exist, though, as our priorities have changed; in fact, they now present completely new challenges. Probably the biggest is the I/O problem you brought up: the I/O bottleneck is a real problem, and I think far more resources will be put toward it than ever before, again because our priorities have changed.

The same goes for the other problems you mentioned: they still exist, it's just that our priorities have shifted now that the bottleneck is something else. The reason those problems are even problems right now is that we prioritized memory over almost all other parts of the hardware; think about how quickly memory and storage have grown compared to I/O retrieval speeds and you'll see exactly what I'm talking about.

Dunno what the cloud has to do with any of it, because if you're reading/writing information to disk at any point in the transaction, there still needs to be a disk write somewhere. Usually an end user won't notice, since the transaction occurs off-site, but if many people are performing disk read/write operations on the source server, it can cause serious issues, even potential downtime. Part of the reason data centers have so many servers is to mitigate this problem. SSDs are becoming more and more affordable, so the problem might be somewhat mitigated there, but even SSDs have limitations, and I think they will continue to see a lot of development as I/O times try to catch up with processor times.

No need to argue over definitions, though. To be honest, I'm really excited to see what the future holds. Now that the problem with Moore's law has become somewhat irrelevant, we can utilize this huge memory resource to build out our software in ways that have never been tried before. Not to say memory will never be a problem again, but right now growth is shifting toward implementation and away from pure hardware. It's becoming abundantly clear to me, as a developer and as a one-man project manager, that these things are still in their infancy in the industry. Another technological revolution is about to begin, and we're right on the edge of it.

Menaus

I think quantum computing is mostly a pipe dream. Everyone talks about how great it's going to be; nobody actually makes a working consumer product, which is the only kind of product that matters to the tech industry. I would compare it to the use of DC for transmission purposes near the turn of the 20th century. Everyone says it's going to be great, and many people are trying to make it work... but it's so expensive that it's only feasible as a luxury good for millionaires or researchers. Like DC, the state of quantum computing might change in the future, but I wouldn't bet on it happening anytime soon, and I'd guess we still have a long way to go with good ol' regular computing before the benefits of switching our whole electronics paradigm outweigh the costs.
"You state that I have misinterpreted my results, and it looks as though you believe my views to be unsound. Your arguments are those of an eminent scholar. I was myself a fair scholar. For years I pondered, so to speak, day and night over books, and filled my head with sound views–very sound ones, indeed—those of others. But I could no[t] get to practical results. I then began to work and think independently. Gradually my views became unsound, but they conducted me to some sound results." - Nikola Tesla

charon the ferryman

QC will probably end up being used in specific applications and won't likely ever be in most people's hands directly. The main problem is that programming an actual application for it is very challenging, on top of the obvious pricing problems. As stated previously, though, it could be used for certain security applications: connected to the internet, used to calculate some value, and returning it back. The main advantage of QC is the ability to calculate many things at once, so I could definitely see it having an application there.

To be brutally honest, though, I think the QC will just be hooked up to a standard machine that does all the networking lol.