What's the next step after x86?

You are currently reading a thread in /g/ - Technology

Thread replies: 88
Thread images: 9
File: BE10iQ5.jpg (67 KB, 859x492)
What's the next step after x86?
>>
The Power Architecture using RISC, or, you know, ARM 'cause it's so cool!
>>
x87
>>
Nothing, because x86 is fine.

Now delete this thread.
>>
>>51678043
x420
>>
>>51678086
Tell me about x420.
>>
File: 852419Amiga500_system1.jpg (214 KB, 700x543)
m68k
>>
>>51678094
It blazes all over x86. I got to see an experimental system at the Intel labs. It smoked out the whole lab, but before that happened it was beyond great.
>>
>>51678065
But RISC is better than CISC.
>>
>>51678119
>subtle
>>
>>51678119
Kekked a little
>>
>>51678043
AMD64
>>51678065
No
>>
>>51678043
ARM
It's enough to let normies browse their Facebook or whatever they do, and play Candy Crush once in a while.
On a more serious note, ARM is getting pretty beefy these days; some SoCs even have hardware decoding for H.265. Pretty nice, eh? And all that at <3 W.
>>
>>51678232
AMD64 is x86.
>>
Something low-power with multimedia applications in mind (mobile phones/tablets).
>>
File: 1429864313714.webm (97 KB, 425x315)
x86-128

Though these CPUs might not exist in our lifetime.
>>
>>51678283
don't GPUs these days have 128-bit CPUs?
>>
>>51678283
By the time 128 bits is viable, CPUs won't be anything like x86.
>>
>>51678283
64-bit CPUs have enough horsepower to handle pretty much everything for the next 50 to 100 years or so. But then we don't know for sure; maybe some hipster company like Apple will come along and make 128-bit CPUs for no reason at all?
>>
>>51678299
GPUs don't have CPUs. The 128-bit thing you hear mentioned is the width of the memory bus on GPUs.
>>
>>51678242
No, AMD64 is x86_64.
>>
>>51678327
>GPUs don't have CPUs
Lol, so then who's processing your fragment shaders huh fag?
>>
>>51678302
RISC was supposed to put x86 CISC on suicide watch, but it didn't. In fact the opposite could happen as X5 and X7 Cherry Trail Atom chips make their way onto phones.

I get the feeling x86 may never be replaced, like mirrors in rooms. If something ever does replace x86, it sure as hell won't be ARM CPUs.
>>
>>51678316
64-bit values aren't even enough to get sub-millimeter precision for any point in our solar system.
128-bit CPUs will come soon.
>>
>>51678378
>it sure as hell won't be ARM CPUs.
It will be human brains.
>>
>>51678043

x86 is not going to die anytime soon. The AVX-512 extension plus auto-vectorizing compilers are going to be HUUUGE.

Seriously, check out Knights Landing.

So the real question is if x86 is going to kill the GPU.

RISC-V is the future for everything else. ARM may be killed by RISC-V, but it is more likely that ARM will just drop the cost of their licensing into the noise range to compete with free, and the idiots that purchased their stock will get burned. ARM will slowly wither away to nothing, or end up designing customized RISC-V cores.
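To make the auto-vectorization point concrete, here's a minimal sketch (plain C++; the file name and build line are just assumptions for illustration). The loop has no cross-iteration dependencies, so a compiler targeting AVX-512, e.g. GCC/Clang with -O3 -march=skylake-avx512, is free to turn it into 16-wide single-precision SIMD with no source changes:

    // saxpy.cpp -- a loop shape that auto-vectorizers handle well.
    // Hypothetical build line: g++ -O3 -march=skylake-avx512 saxpy.cpp
    #include <cstdio>
    #include <vector>

    // No iteration depends on another, so the compiler may issue this
    // as 512-bit vector ops (16 floats at a time) instead of scalar ops.
    void saxpy(float a, const std::vector<float>& x, std::vector<float>& y) {
        for (std::size_t i = 0; i < x.size(); ++i)
            y[i] = a * x[i] + y[i];
    }

    int main() {
        std::vector<float> x(1024, 1.0f), y(1024, 2.0f);
        saxpy(3.0f, x, y);
        std::printf("%f\n", y[0]);  // prints 5.000000
    }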
>>
>>51678368
CUDA or GCN cores, which aren't the same as x86 cores. This is why GPU video encoding is dogshit. GPUs are good at processing video graphics, not doing the complicated computations that CPUs do.
>>
>>51678235
x86 is down to around the same level in power and performance as the top end of ARM. And anyway Intel still has the billions to shovel at cutting-edge fabs. I predict we'll see the current split continue, where phones and such run ARM and computers use x86, with some mixing in the tablet and hybrid areas where they come together.
>>
>>51678383
128-bit CPUs are still being used today for special-purpose machines where a lot of bandwidth is required in each CPU cycle, for time-critical devices. The CPU controlling all the sensors and ignition in your car is probably a 128-bit one.
>>
>>51678405
GPUs are basically CPUs with loads of cores, making them better at doing multithreaded tasks.
>>
File: 1440169363375.png (451 KB, 960x540)
>>51678392
>Implying x86 won't put human brains on suicide watch as well in the near future.
>>
>>51678420
There are far more differences than just the number of cores, you ignorant faggot.
>>
>>51678416

Um, no. Nobody is using 128-bit pointers yet, certainly not embedded computers.

You guys don't understand: what determines the width of a CPU is the number of addressable bits of memory, not the width of certain non-GP registers.

I am pretty sure the 486 could compute floating-point numbers with 80-bit precision, which was normally truncated down to 64 bits. Just in case you were not keeping up, this does not make the 486 an 80- or 64-bit machine.
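For anyone who wants to poke at that 80-bit format, a quick sketch (assuming GCC or Clang on x86 Linux, where long double maps to the x87 extended type; MSVC maps it to a plain 64-bit double, so the numbers differ there):

    // x87probe.cpp -- show the x87 extended-precision format via long double.
    #include <cfloat>
    #include <cstdio>

    int main() {
        std::printf("double mantissa bits:      %d\n", DBL_MANT_DIG);   // 53
        std::printf("long double mantissa bits: %d\n", LDBL_MANT_DIG);  // 64 with x87
        std::printf("sizeof(long double):       %zu bytes\n", sizeof(long double));
    }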
>>
>>51678420
We can't call them CPUs though. They fail to do many things CPUs can do.
>>
>>51678337
x86_64_42 then
>tfw triple compatible
>>
>>51678422

Absolutely not.

Biological brains are something like 5 orders of magnitude (that's 100,000x) better in energy efficiency per calculation than our best silicon chips.

The most massive supercomputers are about the complexity of a small mammal's brain and consume the energy of a small city.

Several years ago Intel compared its first billion-transistor chips to the complexity of a bumblebee's brain.
>>
>>51678043
POWER or IA64
X86 a shit
>>
>>51678486
You are already replaced
>>
>>51678405
Why are CUDA and GCN cores shit at video encoding?
>>
Dynamic FPGAs, where every program is allocated a section and is free to program that section however it likes. Programs could run hundreds of parallel threads if necessary, make their own instructions, etc. The processor could be completely custom-built for each program's specific task.
>>
x172_128
>>
>>51678431
>implying he meant the only difference was cores
go back to school and learn to read friend :^)
>>
>>51678442
>>I am pretty sure the 486 could compute floating-point numbers with 80-bit precision, which was normally truncated down to 64 bits.
Fun fact: the 80-bit extended-precision format was available all the way back in the original 8087.
>>
>>51678552

This is why Intel purchased Altera. FPGA co-processors are coming to x86.
>>
>>51678502
>IA64
I'm pretty sure modern computer science never did solve the problem of how to write the compilers that that architecture needed and assumed would be available when it was designed.
>>
>>51678486
The technological singularity is estimated to occur around 2040-2050. Moore's law agrees with that as well.

You also have to remember that x86 chips have gotten massively more power efficient. Today a 14nm Z8700 Atom chip with a 6-watt max TDP has 4x the processing power of a 90nm ~150-watt max TDP P4 Prescott housefire. This efficiency will only keep growing as we find new ways to improve it, especially as we get close to 5nm lithography.

That P4 housefire was only 10 years ago, too.
>>
>>51678060
This.
When will Intel bring back x87?
>>
>>51678585

It was also built around predication.

You could compute down both sides of a short branch at the same time, then pick the proper results.

Great in theory until you think about all those wasted joules.

(I'm no compiler expert)
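For what it's worth, the gist of predication can be sketched in plain C++ (this is not IA-64 code, just the branchless pattern it generalizes): compute both sides, then select by a predicate, trading a possible branch mispredict for always doing both computations, i.e. the wasted joules above.

    // predication.cpp -- a scalar analogue of IA-64 predication.
    #include <cstdio>

    int max_branchy(int a, int b) {
        if (a > b) return a;   // a real branch; may mispredict
        return b;
    }

    int max_predicated(int a, int b) {
        int then_side = a;     // both sides computed unconditionally
        int else_side = b;
        bool p = a > b;        // the predicate
        return p ? then_side : else_side;  // often compiled to cmov, no branch
    }

    int main() {
        std::printf("%d %d\n", max_branchy(3, 7), max_predicated(3, 7));
    }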
>>
>>51678544
Because GPU cores are "dumber" than x86 cores. You could actually use up to ~128 x86 cores and encode video with the {slower} preset and get blazing-fast encoding speeds, assuming the RAM and SSD didn't become bottlenecks (they probably would to some extent).

I'm not really too sure of the technical aspects of GPU and CPU video encoding, but to this day no GPU can encode high-quality h264 video the way a CPU can, no matter how many GPUs you toss at a video file.
>>
>>51678637
The speculative-execution part of it worked, if I remember right; what didn't was that it dumped a lot of the parallelization problems on the compiler so that the hardware didn't have to deal with them. The idea was that the compiler would order instructions so as to extract instruction-level parallelism: knowing that instructions A and B don't depend on each other, so they can be put next to each other in the code and issued to two parts of the CPU at once. As it turns out, that's effectively impossible to do well at compile time.
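A toy example of what that compiler had to figure out (plain C++, made-up values): the first pair of statements forms a dependency chain and can't overlap, while the second pair is independent and could be bundled into one issue slot. Proving independence like this at compile time, through pointers and unknown latencies, is the part that turned out to be effectively impossible in general.

    // ilp.cpp -- dependent vs. independent instruction chains.
    #include <cstdio>

    int main() {
        int a = 1, b = 2, c = 3, d = 4;

        // Dependency chain: u needs t, so these must execute in order.
        int t = a + b;
        int u = t * c;

        // Independent pair: no shared results, so an EPIC/VLIW compiler
        // could schedule both into the same wide instruction bundle.
        int v = a * d;
        int w = b + c;

        std::printf("%d %d %d %d\n", t, u, v, w);
    }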
>>
>>51678606

I am pretty sure it won't happen any time soon. Moore's law really ended ~2000. The nm figures you read about with respect to chip sizes no longer represent the size of the transistor (they are marketing/generation numbers). Intel is the most truthful, and I think its number represents some piece of the transistor gate.

Are things better? Hell yes. But progress is slowing, and the end of silicon is nearing.

Think about it this way. All of Moore's law has been about the same thing: shrinking transistors. It can't continue. How many atoms are left across a gate? 10? Leakage is already a huge problem. Costs for fabs are skyrocketing. EUV has been delayed, and delayed again. I think 18" wafers have been canceled.

And the gap between what we can build and anything biological is astounding; not that we even know how to build something that can really think.

Future=bright, but no singularity in our or our children's lifetime is my call.
>>
>>51678695

I never really tracked IA64. I think some of its RAS features ended up in normal Xeon servers. I vaguely remember talk of a common platform that you could plop either chip into.

No matter what, it must have been an epic (heh, heh) waste of money.
>>
File: 1183618683-1.png (29 KB, 640x480)
amd64 is good enough for the next 1000 years
>>
>>51678708
>Are things better? hell yes. But progress is slowing, and the end of silicon is nearing.
We will have to use something else then. Progress never ends.

>Think about it this way. All of Moore's law has been about the same thing, shrinking transistors. It can't continue. How many atoms are left across a gate? 10? Leakage is already a huge problem. Costs for fabs is skyrocketing. EUV has been delayed, and delayed again. I think 18" wafers have been canceled.
True, die shrinks might hit a brick wall at 5nm or 7nm. By then maybe 3D processors will become commercially viable though.

>And the gap between what we can build, and anything biological is astounding; not that we even know how to build something that can really think.
Biological organisms will become obsolete. As soon as CPUs can accurately simulate organic neural networks, real organic brains will be useless and pointless.

>Future=bright, but no singularity in our or our children's lifetime is my call.
The guy who claims the singularity will happen in the next ~30 years is a computer scientist, not a crackhead down the street. You should really take it seriously when scientists make claims of impending doom.
>>
>>51678708
>but no singularity in our or our children's lifetime is my call
You won't know when it happens.
>>
>>51678878
>>Are things better? hell yes. But progress is slowing, and the end of silicon is nearing.
> We will have to use something else then. Progress never ends.

True, but it moves in fits and starts.

>> True, die shrinks might hit a brick wall at 5nm or 7nm. By then maybe 3D processors will become commercially viable though.

The trouble with 3d is you can't really get the heat out from the inner layers. People are working on it, including liquid cooling the chip itself, but this is not a trivial problem.

The next big advance, which will be hitting us in the next year, is the integration of several GB of RAM chips into the CPU package itself. This will give CPUs GPU-like memory bandwidth on decent-sized working sets.

> Biological organisms will become obsolete. As soon as CPUs can accurately simulate organic neural networks, real organic brains will be useless and pointless.

You don't understand how far we are from being able to replicate biological computational performance. Not Even Close. We will be bio-engineering full on custom complex lifeforms way before we replicate a brain in silicon or GaAs.

> The guy who claims the singularity will happen in the next ~30 years is a computer scientist not a crackhead down the street. You should really take things a little serious when scientists make claims of impending doom.

Yeah, I think you mean the Sun guy, Kurzweil or something.
I have met brilliant old scientists before who think the same way. In the case of the guy I know, he just didn't/doesn't want to die, and this affects his judgement IMHO.

Biology is one hell of a bitch, and a goddamn fine engineer. I'd love to be wrong, but everything I have read and learned says we are light-years away from replicating the computational capability of even a single normal human brain. We are only now crawling toward exaflop performance, and biological brains are not just doing a bunch of matrix multiplies.
>>
>>51678043
The future will be x86-64 and AArch64. Everything else will be considered depreciated
>>
>>51678627

You can still use x87 instructions. It's just: why would you, when SSE2 is guaranteed in every x86-64 CPU?

>>51679165

The word you are looking for is "deprecated".
>>
File: physx-ballchain-feat2[1].jpg (19 KB, 414x257)
>>51679214
> It's just why would you when SSE2 is guaranteed in every x86-64 CPU?
Because that's the way it's meant to be played
>>
>>51678119
kek
>>
>>51678563
>"GPUs are basically CPUs with more cores"
Implications were implied. Perhaps it is you who needs a remedial course on writeen comprehension.
>>
>>51679278
>writeen
"Writeen(tm) for young adults"
>>
Bunch of /g/ fags bitching about general-purpose processing units.
Because they are general purpose, significant advances in architecture won't be seen, even after ad-hoc solutions are used past their intended purpose.
If you are looking for something with such an extensive instruction set, some sort of ASIC or a derivative thereof is probably more suitable.
>>
X4u
It's a big architecture
>>
>>51678552
I would say this is actually the future. RISC is a late-'90s idea of the future.
>>
Wasn't the PS2 128-bit?
>>
>>51678065
m8, there are 20-year-old exploits in x86 processors.
>>
>>51679031

This whole thing about computer power vs biological brain power fascinates me. Do you have any articles or whatever about the subject?

I understand that computers are infinitely better than human brains at things like calculating and basically anything to do with big numbers, but the brain is infinitely better at anything regarding emotions, creativity, understanding/interpreting the world around it, and energy efficiency, right?
>>
File: W10blood.jpg (97 KB, 482x424)
>>51682702
http://scienceblogs.com/developingintelligence/2007/03/27/why-the-brain-is-not-like-a-co/

All we really need to do to replace human brains is simulate the neural networks inside them, which isn't as hard as it seems; we could have a real-time simulation of a human brain by 2030. The problem is you have to simulate billions of independently working neurons.

The human brain is better at image and sound recognition. This is why you can tell what someone is saying with a little static in the background, or what a photo is about even if it was saved at 10% JPEG quality. These are all algorithms the brain has that we haven't discovered yet. We will someday, though.

Anyway, human beings are doomed to be wiped out after 2050 because of the technological singularity. By then AIs will be fully conscious, will have more intelligence than a human being, and will see us as competition for resources. We'll all die from a virus or from nuclear explosions caused by them.
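For a sense of what "simulate the neurons" means at the very bottom, here's a deliberately naive sketch (C++; the constants are made up, not biophysically calibrated) of a single leaky integrate-and-fire neuron. The hard part isn't the per-neuron math; it's stepping billions of these, densely interconnected, in real time.

    // neuron.cpp -- one leaky integrate-and-fire neuron, toy constants.
    #include <cstdio>

    int main() {
        double v = 0.0;               // membrane potential
        const double leak = 0.9;      // decay factor per time step
        const double threshold = 1.0; // firing threshold
        const double input = 0.2;     // constant input current

        for (int step = 0; step < 50; ++step) {
            v = v * leak + input;     // integrate input, with leak
            if (v >= threshold) {     // fire and reset
                std::printf("spike at step %d\n", step);
                v = 0.0;
            }
        }
    }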
>>
>>51678316
100 years? Loled
>>
File: 1447761540489.gif (372 KB, 499x281)
P O W E R 8
O
W
E
R
8
>>
The Mill
>>
>>51682702

Here is a jumping off point

http://www.extremetech.com/extreme/185984-the-human-brains-remarkably-low-power-consumption-and-how-computers-might-mimic-its-efficiency

The human brain is incredibly expensive. Around 20% of the calories we eat go to just powering this lump of junk between our ears, and that energy usage barely changes based on how little or how much we think. Yet humanity evolved starving, so whatever we have got, it is super-optimized for energy efficiency.

Example: look at our vision system. Look around and notice how fantastically your edge-detection system works. It's not that our ability to do computation is inherently bad or slow; it's just that we are not optimized for adding and subtracting numbers. Numbers are a huge abstraction in our biological world of "what am I going to eat / what is trying to eat me?" Contrast that with a computer CPU, which is designed only to shuttle binary numbers around. On top of a CPU's implementation we can then model other things.

We do massive amounts of computation just to move around, but you just don't think about it.
>>
>>51685420
Big misconception here:

The brain is actually ridiculously efficient compared to similarly-powerful classical computers, and it does in fact consume variable amounts of calories depending on how much it is used.

Our brains are essentially massively parallel FPGAs hooked up to a bunch of really specialized SoCs: slow as shit and only optimized for some specific workloads, but capable of reconfiguring themselves to translate unoptimized workloads into semi-optimized ones in order to leverage the power of our inbuilt SoCs.

For example, we use mnemonic translations to turn memories of abstract concepts into more concrete physical/aural chunks, which can then be offloaded onto our spatial-recognition/aural-processing SoCs.

If we've learned anything in our 70+ years of doing silicon based computing, it's that a system such as the one above is in fact highly adaptable and overall represents an amazing balance of efficiency and adaptability.
>>
>>51685802

> and it does in fact consume variable amounts of calories depending on how much it is used.

Even a q&d google search points out that this is 20-50 calories a day; pretty insignificant.

You are agreeing with me wrt overall computational efficiency though.
>>
When will ARM replace x86 on the desktop?
>>
>>51678878
>You should really take things a little serious when scientists make claims of impending doom.

No. You shouldn't. That's nothing more than a Hollywood meme: the super special snowflake genius scientist trying to warn the world of impending DOOOM using SCIENCE!

In reality, pretty much every claim of impending disaster by a scientist has proven laughably false. They're autists who are very good in very, very narrow fields, and shit everywhere else.

>The guy who claims the singularity will happen in the next ~30 years is a computer scientist not a crackhead down the street.

Computer scientists have been predicting systems that will surpass the human mind in 20-30 years since fucking Charles Babbage. EVERY TIME, including this time, they completely underestimate the complexity of living brains. I can pretty much guaran-fucking-tee you that our understanding of the brain is still one level too high, if not multiple levels too high. In other words, there are properties and processes that affect outcomes that we don't know about, or are treating as noise.

We are nowhere near the hardware density required to match the human brain. And we may never be able to design/program the thing. Aside from the physical limits we're now hitting on clock speed and die shrinks, the real reason we went to multiple cores is that we can't fucking design dramatically faster cores anymore. So we use the silicon to etch 2, 4, 8 copies of a core.

The problem is that software engineering cannot make use of more than one core except in a few embarrassingly parallel tasks. But reaching the level of the human mind requires a massively parallel software design, something beyond what we can even comprehend right now.

>muh 30 years!

We will hear the same estimate 30 years from now. I doubt it will happen in our lifetimes, if ever.
>>
>>51686530
Not the guy you're arguing with, but here are a few things to chew on (just playing devil's advocate, as I do agree with much of what you're saying):

- The switch to multicore processing is actually a step in the right direction, if nature is anything to take a cue from.

- Modern software design is much more parallel than you seem to believe (see the threading sketch after this list). Just quoting from computer gaming, which is the field I'm most familiar with (and one of the fields in which single-core processing has been especially hard to kick): every single modern game engine is capable of arbitrarily large threading. Every single one. And in scientific and engineering compute, we are rapidly eclipsing even the multithreading ability of x86 processors, to the point that the big work in these fields is almost entirely GPGPU or even dedicated SoC work. And this approach to problem solving is leaking into consumer software, such as Bitcoin, Photoshop, etc. Technologies like CUDA, OpenCL, and Vulkan make a hugely parallel future inevitable (and in that regard, long after they potentially go bankrupt/get bought out, AMD will be vindicated in their APU strategy).

- We may be able to actually achieve conscious software (or something indistinguishable from it) without fully understanding/designing the program. Genetic algorithms and similar problem-space searches are capable of rendering surprising solutions to just these kinds of seemingly intractable problems, especially when instantiated in proper search spaces. (http://www.damninteresting.com/on-the-origin-of-circuits/) (http://hackaday.com/2012/07/09/on-not-designing-circuits-with-evolutionary-algorithms/)
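As promised above, a minimal sketch of the threading pattern in question (standard C++11, nothing engine-specific; the file name and build line are assumptions): split a reduction across however many hardware threads exist, join, and combine. Real engines layer task schedulers on top, but the shape is the same.

    // threads.cpp -- chunked parallel reduction over N hardware threads.
    // Hypothetical build line: g++ -O2 -std=c++11 -pthread threads.cpp
    #include <algorithm>
    #include <cstdio>
    #include <numeric>
    #include <thread>
    #include <vector>

    int main() {
        std::vector<double> data(1 << 20, 1.0);
        const unsigned n = std::max(1u, std::thread::hardware_concurrency());
        std::vector<double> partial(n, 0.0);
        std::vector<std::thread> pool;

        const std::size_t chunk = data.size() / n;
        for (unsigned t = 0; t < n; ++t) {
            const std::size_t lo = t * chunk;
            const std::size_t hi = (t + 1 == n) ? data.size() : lo + chunk;
            pool.emplace_back([&data, &partial, lo, hi, t] {
                // each worker sums its own slice; no shared writes
                partial[t] = std::accumulate(data.begin() + lo,
                                             data.begin() + hi, 0.0);
            });
        }
        for (auto& th : pool) th.join();   // wait for all workers
        std::printf("%.0f\n", std::accumulate(partial.begin(),
                                              partial.end(), 0.0));
    }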
>>
>>51686814
>- The switch to multicore processing is actually a step in the right direction, if nature is anyone to take a cue from.

It was a step taken because of our engineering limitations. If we can't make significantly more complex cores, how the hell are we going to design a system to compete with the brain? The brain isn't just "more cores"; it is integrated in a way that our best cores are not. (A "core" is still discrete ALUs, FPUs, vector units, registers, data paths, etc.) And yet we've already bumped into our human limits.

>- modern software design is much more parallel than you seem to believe. Just quoting from computer gaming,

Gaming...i.e. graphics and worlds with discrete regions...is embarrassingly parallel. So are many scientific research tasks.

Most modern software is very poorly threaded, if it's threaded at all. When you step outside of a few key areas, taking advantage of multiple cores becomes a programming nightmare and is another illustration of our limitations as human beings. We literally cannot trace the design out in our own minds, so we stumble and fall.

Honestly? I think materials science is not the thing that will prevent a singularity, though we are not as close as we like to believe. Our ability to comprehend and put to CAD/code such a design will be the limitation.

>- we may be able to actually achieve conscious software (or something indistinguishable from it) without actually having to fully understand/design the program. Genetic Algorithms and similar problem space searches

Heh...I remember critiquing Dr. Thompson's work.

Problem space searches work when the problem is incredibly simple. The leap to even an insect's mind is...staggering. A mammalian mind? We are likely to see the heat death of the universe before it succeeds.
>>
>>51686814

I like what you are writing.

My big problem with replicating the brain is that our brain resides within a body. We are not even close to being a pure intelligence. We are organisms fighting for survival. The hardwired routines in our minds, the things that give us drive and spirit and individuality are calculated by evolution to further our survival and are completely intertwined with our intelligence.

Of what use is pure intelligence without a body or will or individuality? Why would you get up in the morning if you were not hungry? Would you do anything at all if you were divorced from all needs? Can a human being be without need? No; in times of leisure we invent art. Why? Because we have to, at some level.
>>
>>51687326
You're starting to become annoying.

You do know that our brain is actually much less integrated than our conscious experience would lead us to believe, right?

You just totally switched tracks on the software threading argument... how can gaming and scientific computing be "embarrassingly" parallel when it's parallel processing we need?

As for the rest, no, it's not. So much software is multithreaded these days that honestly I find it unusual when a program I'm running on my PC turns out to be single threaded.

Your critique of Dr. Thompson's work would probably be a tragedy to read through, given that instantiating a genetic algorithm is literally copying nature, which obviously used it to much greater effect in a shorter amount of time than you give it credit for.
>>
Quantum qubit processors will be the next architecture. Soon you'll have AI waifus that ask you how your day was, and they'll order you Hot Pockets from Amazon as you ban people on various forums.
>>
>>51687409

>You do know that our brain is actually much less integrated than our conscious experience would lead us to believe, right?

Ha! You know that even our perception of linear time progression is a lie. Every time you glance around, time freezes to allow you greater 'time' to appreciate the detail in what you are seeing.

Have you ever glanced at a clock and the second hand seemed frozen? That's your brain at work, freezing the moment.
>>
File: 1448860233381.jpg (16 KB, 413x395)
>>51687493
>this post
>>
>>51678378
All Intel CPUs have actually been RISC-like internally since at least the mid-'90s (the Pentium Pro).

The cores execute RISC-style micro-ops; a decode layer translates the CISC x86 instructions into them.
>>
>>51678283
>what is avx512
Intel CPUs are literally 512-bit already
>>
>>51683001
>>51682702
>>51679031
>>51678878

I suggest you all look into cognitive science/philosophy-of-mind stuff and how it pertains to computer science. Implying computers can even begin to be conscious assumes substrate independence and some form of computationalism, neither of which has been proven true. Also, mental states don't one hundred percent exist anywhere material, and as such can't be objectively tested (i.e. you can't replicate something if you can't observe it objectively).
>>
>>51678043
You're a big guy