Looks like A.R.M. is the near future


ricardo marty

  • Posts: 1612
  • Joined: Wed Apr 26, 2017 4:03 am

Looks like A.R.M. is the near future

Posted: Wed Dec 26, 2018 3:16 pm

This sounds almost incredible, especially for laptops. Power consumption cut to less than half, and computing power well beyond present capabilities. Maybe 8K production will be available to almost everyone.



Ricardo Marty

p.s. I don't know which forum to post this in, so if I'm wrong please tell me where to post articles like this.
DVR_S 18.5, Asus ProArt PD5, 2.5 GHz i7 16-core, 64GB 3200 MHz DDR4 RAM, GeForce RTX 3070, 1TB M.2 NVMe, Windows 11; Lenovo Legion, i7-10750H 2.6 GHz, 64GB 3200 MHz, RTX 2060, 1TB M.2 SSD, Win 11; BenQ PD3420Q, Sony FS700R, BMP4K, Sony A6700, PreSonus AudioBox

rick.lang

  • Posts: 17262
  • Joined: Wed Aug 22, 2012 5:41 pm
  • Location: Victoria BC Canada

Re: Looks like A.R.M. is the near future

Posted: Wed Dec 26, 2018 5:39 pm

Normally that might go in the Off Topic area, but no worries.

Edit: At about 8:46 of this interesting video, it refuses to play further.

Sent from my iPhone using Tapatalk
Last edited by rick.lang on Thu Dec 27, 2018 4:18 am, edited 1 time in total.
Rick Lang

Wayne Steven

  • Posts: 3362
  • Joined: Thu Aug 01, 2013 3:58 am
  • Location: Earth

Re: Looks like A.R.M. is the near future

Posted: Thu Dec 27, 2018 1:58 am

Ahhg! I was watching bits of that last night. I wish they would summarize; even my posts are 20x faster to read than that, with similar amounts of information.

I don't know which part you are referring to, but you can get an idea of how little 8K needs from history. A few years ago Ambarella released an 8K drone chipset; I suspect a low-powered version will be announced at CES etc., and maybe even an 8Kp60 version. But that is custom consumer silicon, likely using an ARM as the main processor. If we look back at CineForm RAW, it ran full HD p24 on the original Intel Core Duo processors with no GPU processing. We can probably extrapolate from that: say 2-4 GHz for 2K, and 16x that for 8K, which is 32-64 GHz, or 8-16 cores at 4 GHz, or 16-32 cores at 2 GHz. Now, if we add GPU processing, depending on the GPU, a number of times more can be done. So raw 8K on a phone chip is a possibility between CPU and GPU processing, with 4K raw on the CPU currently.

The Nvidia ARM chip has long had enough camera pixel data rate, and later editions have official 8K support and over 1600 Gbit/s H.264. But don't let that fool you: most phones and chips will only quote you the consumer-codec filming modes, yet the chips can pull a lot more resolution off the sensor as raw data, to be used for stills and for downscaling to the lightweight consumer codec. Hence vizzigig has offered beyond-4K video for years using JPEG at video rates, as one example. I should ask the guy if they can do that with DNG now. But this is only a fraction of what can be done unrestricted. Those Nvidias have a powerful CUDA GPU implementation, and I suspect they could handle 8K raw, even at 50fps and high bit depth. To put this in perspective: replace one of the many GPU cores, or even just add raw-processing instructions for a few thousand transistors extra, and that single core might even do it. That's speculation, but it shows why custom circuit support is so powerful.

Now, let's step sideways. DNG is an underused standard, apart from being an alternative some equipment uses, and it can be made into a Blackmagic RAW-like product in a multimedia container file with more metadata. So the exciting thing is that products will have DNG custom circuits built in, and some might be suitable.
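
To spell that extrapolation out as a rough sketch only (assuming 1920x1080 for full HD, 7680x4320 for 8K, and scaling purely by pixel count, which ignores codec and memory-bandwidth differences):

```python
# Back-of-the-envelope scaling of CPU clock needed for raw playback,
# extrapolated from CineForm RAW running 1080p24 on a 2-4 GHz dual-core era CPU.
FULL_HD = 1920 * 1080   # assumed "2K"/full HD pixel count
EIGHT_K = 7680 * 4320   # assumed 8K pixel count

scale = EIGHT_K / FULL_HD            # = 16x the pixels per frame
base_low, base_high = 2.0, 4.0       # GHz assumed sufficient for 1080p24

total_low, total_high = base_low * scale, base_high * scale
print(f"8K needs roughly {total_low:.0f}-{total_high:.0f} GHz of equivalent CPU")
print(f"i.e. {total_low/4:.0f}-{total_high/4:.0f} cores at 4 GHz, "
      f"or {total_low/2:.0f}-{total_high/2:.0f} cores at 2 GHz")
```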

BM could always approach Qualcomm and ask for such a circuit to be included, and use that in their products. The thing needed is just a minor addition, but it needs to go into multiple levels of chips so you don't have 99% of the chip going to waste. Another way to go is to approach their usual suppliers of codec chips and ask for a raw mode to be added. Then you get a very tight application-specific integrated circuit where everything has a purpose: cheaper, smaller, lower powered. Unlike Red, BM has a range of non-camera products that could use such a chip and boost its volume.
a. If you are not truthfully progressive, maybe you shouldn't say anything
b. Truthful side topics in line with, or related to, the discussion are accepted
c. Often people deceive themselves so much they do not understand, even when the truth is explained to them

ricardo marty

  • Posts: 1612
  • Joined: Wed Apr 26, 2017 4:03 am

Re: Looks like A.R.M. is the near future

Posted: Thu Dec 27, 2018 7:49 pm

Wayne, you're accurate and knowledgeable, but that was not my point. Please forgive me; I was very frugal with words in my posting. I think that ARM will be a disruptive technology for many industries. The ability to handle raw codecs on a simple laptop, or maybe even a tablet, has far-reaching consequences.

In a high-competition environment like the computer business it will change everything, starting with the giants. It will be just like the TV set business. Do you remember $15k 48" flat televisions? And so on. The question is: will it benefit our businesses? At first glance it looks like it will, but then we have the law of unintended consequences. Content will be a dime a dozen, yes, even high-quality products, and I'm talking about top professionally made content from top talent.

Ricardo Marty
DVR_S 18.5, Asus ProArt PD5, 2.5 GHz i7 16-core, 64GB 3200 MHz DDR4 RAM, GeForce RTX 3070, 1TB M.2 NVMe, Windows 11; Lenovo Legion, i7-10750H 2.6 GHz, 64GB 3200 MHz, RTX 2060, 1TB M.2 SSD, Win 11; BenQ PD3420Q, Sony FS700R, BMP4K, Sony A6700, PreSonus AudioBox

Wayne Steven

  • Posts: 3362
  • Joined: Thu Aug 01, 2013 3:58 am
  • Location: Earth

Re: Looks like A.R.M. is the near future

Posted: Fri Dec 28, 2018 4:24 am

Thanks. Sorry. But we can do that now with the latest GPUs, which are much better than an ARM at various things.

I've tracked the ARM since 1987 or earlier, since before the original Archimedes computer came out, and maybe before that: I was thinking of an Electron computer with an ARM, the BBC ARM CPU add-on module was out, and I had heard about them doing a CPU around 1983. It was a large stepping stone in the CPU community, which I'm part of. But it has become bloated, with nearly as many instructions supported as Intel (well over 1000), whereas my own design proposal has fewer than 16 instructions, aims at 5 nanometre to fit the CPU core, and is meant to replace GPU, DSP, FPGA, and ASIC in various circuits. Hopefully it would be 5 GHz and low powered in today's processes (though even if only 2 GHz is achievable, that is still a significant improvement; these cores are going to be much bigger than 5 nm), and much, much more in tomorrow's technology, with I don't know how far past 10 GHz in today's technology on a high-energy circuit. Many years of thinking went into getting around the many constrictions of today's designs and the constrictions applied by today's process technology, and into shifting from normal CPU chip process technology to another.

At the moment they are much better off looking at next-generation ARM technology aimed at competing with RISC-V, the current open-source challenger, and scrapping the existing instruction set support. They would literally be better off investing in bringing RISC-V up to scratch than sticking with the current ARM.

I'm not doing much on this lately and not keeping up with the whole processor scene and its advancements, but ARM is just a step in the right direction, decades overdue; not a quantum leap, but also not like the Quantum Leap (a reference to a computer I had: it promised much, was just a bit better, but a nice machine).

Back around the announcement of the Archimedes computer, they were talking about the ARM becoming a mainstream business machine CPU, but that went quiet. They probably found the market was tied up and people locked into Intel/Motorola were not willing to play ball; plus the collaborative PowerPC processor and Intel's x86 replacement would have focussed attention on more feature-mature technologies. If it wasn't for an outbreak of viruses it might not have mattered so much, as the ARM was the most powerful desktop CPU by far (though the mob I was supporting had a more mature, innovative, powerful CPU, in a gate-array format mind you), and things like StrongARM kept that tradition alive for a while. The integrated ARM chip technology was a leader, making cheap business machines possible, which would have given them a market for years, and the time and money to mature the technology for Unix-like systems and the features needed to move higher.

Simply put, the simpler ARM design allowed lower power and higher performance within a power envelope, but it needed the investment to design that. RISC-V also promises this over the ARM, using a lot of architectural CPU design lessons from over the decades. In the end, that might not matter so much, as the CPU core becomes less and less of the overall design and processing is more and more concentrated outside the CPU. So on a 64-core chip, great compared with Intel, still better than ARM; but on an integrated chip where the CPU is only a little of the processing ability, not so great. My design is meant to be most of the design, so it matters more.

At this scale it comes down to tasks performed per unit of energy, ultimate performance, and to a lesser extent performance and energy per unit of area. Existing CPU-level benchmarks don't matter as much; everything changes so much that you are looking at something more like application benchmarking. So you would be looking at between 10-100x the performance density, and up to 1000x, over today's machines. Even if that only translates into 10x the application performance (or 100x given a heap of applications in parallel), that's still great. You'll notice the figures look a bit funny; that is because a lot of the performance per unit of energy comes from lower power, not greater throughput. The processing also uses lightweight instructions, and because you have lower power you can pack processors more densely, but because you still get heat build-up you can only go so dense before you have to reduce speed.

However, what I'm working on now is next-generation process technology that goes a lot faster. But I seriously need big money right now, and a team of scientists and engineers. Jim and Grant are wasting money. The low-cost custom silicon manufacturing scheme I mentioned elsewhere, which could deliver a $100 8K or 16K camera with storage, could use this technology; simply put, it could be as small as a cinema sensor (storage allowing, as that might require a 3D array, which is not so simple).
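
As a toy illustration of that trade-off, with made-up numbers (assuming dynamic power scales roughly with capacitance x V^2 x f, and that a slower clock permits a lower supply voltage):

```python
# Toy sketch: how many cores fit in a fixed thermal budget when you trade
# clock speed (and supply voltage) for core count. All figures are
# illustrative assumptions, not measurements of any real chip.
def core_power_watts(freq_ghz, volts, cap_factor=1.0):
    """Dynamic power ~ C * V^2 * f (leakage ignored for simplicity)."""
    return cap_factor * volts**2 * freq_ghz

thermal_budget_w = 10.0  # assumed phone/tablet-class power envelope

fast = core_power_watts(freq_ghz=4.0, volts=1.1)  # one big, fast core
slow = core_power_watts(freq_ghz=2.0, volts=0.8)  # slower core, lower voltage

n_fast = int(thermal_budget_w // fast)
n_slow = int(thermal_budget_w // slow)
print(f"fast cores in budget: {n_fast}, aggregate {n_fast * 4.0} GHz")
print(f"slow cores in budget: {n_slow}, aggregate {n_slow * 2.0} GHz")
# Each slow core does half the work of a fast one, yet the many-slow-core
# option still wins on aggregate throughput within the same power envelope.
```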
a. If you are not truthfully progressive, maybe you shouldn't say anything
b. Truthful side topics in line with, or related to, the discussion are accepted
c. Often people deceive themselves so much they do not understand, even when the truth is explained to them
