I am not sure how much “power” I ideally need to handle 4K seamlessly and without any real delays while the RAM, processor and graphics card are chuntering.
I am inclined towards the new MacBook Pro 15 inch (I know, I know, don't mention it), but the range goes from the basic 2.3GHz 8-core to the 2.4GHz, with options of up to 32GB of RAM and three different Radeon cards (and a price shift of nearly £1K from bottom to top).
How much difference in the real world is there likely to be? I have read the reviews (such as they are) and I am not sure I am any the wiser.
Fuji, if you want to be able to scrub 4K you will need a lot more power than you can buy off the shelf. I have been checking out Final Cut Pro and Adobe Premiere, and the only way I can get what I want, if I am cutting something longer than a minute, is to have a machine built. The cooling alone needs 4 fans. The cost of parts is about £3,000 plus, and then paying someone to put it together if you are like me.
For very short edits I am told that some people are buying old Macs (I can't remember which, but it looks a bit like a PlayStation), ripping them apart and adapting them for the purpose. I don't totally understand the details.
4K and “seamlessly” don't fit well in the same sentence. Even if you are not slowed down by mass storage and RAM, it will take at least 4 times longer to process the same footage compared with 1080p. So you could buy a 32-core CPU in place of your 8-core. Clock frequency plays a part, but not that much. And water-cooling solutions may be worth it if you want to keep it quiet.
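The “at least 4 times longer” figure follows straight from the pixel counts; a back-of-the-envelope sketch (assuming the standard UHD 3840×2160 and Full HD 1920×1080 frame sizes):

```python
# Back-of-the-envelope: a UHD frame has 4x the pixels of a 1080p frame,
# so per-frame work scales roughly 4x at the same frame rate and codec.
uhd_pixels = 3840 * 2160   # consumer "4K" (UHD) frame
fhd_pixels = 1920 * 1080   # Full HD (1080p) frame
print(uhd_pixels / fhd_pixels)  # -> 4.0
```

That ratio is a floor: effects, colour correction and higher-bitrate decoding can make the real-world slowdown worse than 4x.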
Thank you for your comments so far. A little more research suggests some conflicting views. I read somewhere one chap saying the top-spec new MacBook Pro handles 4K very well, and another saying that it didn't. The first said he had purchased the computer; the second didn't say. It is very frustrating when you are not sure who writes from first-hand experience.
Fuji, I can give you a list of the equipment used in pro 4K and 8K edit suites in the US. You can then compare it to the speeds you can get out of the latest MacBook. Nearly all the hardware bits are available through Amazon. If you wish, just PM me, as I am sure the whole list would be very boring to most. By the way, this kit makes a great simulator during downtime.
There is 4K and there is 4K.
If you shoot a freshly groomed ski slope, in sunlight, with a shaking camera, you will need at least 200Mbps (megabits per second) of final MPEG (compressed!) data rate to render such a scene, and I very much doubt any laptop will play such a video. In fact there is no consumer-level camera which will shoot 4K at 200Mbps; the highest I know about is 100Mbps (the Sony X3000 I have will do it, and no doubt others too, since this is 2 years old). You will also fill up a 256GB SD card at an eye-watering speed.
At the other end of the scale, shooting 4K of a stationary object will be a few Mbps.
In between the two, shooting say a talking head with a stationary background (the gimbal-stabilised cameras like the DJI Osmo Pocket do that very well) produces a low bandwidth of say 10Mbps.
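To put a number on that “eye-watering speed”: a quick calculation using the figures quoted above (a 256GB card and the 100/200Mbps rates; card makers use decimal gigabytes):

```python
# Minutes of footage a memory card holds at a constant bitrate.
def recording_minutes(card_gb: float, mbps: float) -> float:
    card_bits = card_gb * 1e9 * 8          # decimal GB -> bits
    return card_bits / (mbps * 1e6) / 60   # bits / (bits per second) -> minutes

print(round(recording_minutes(256, 200)))  # -> 171 minutes at 200Mbps
print(round(recording_minutes(256, 100)))  # -> 341 minutes at 100Mbps
```

So a 256GB card holds just under three hours at 200Mbps, and that ignores container and filesystem overhead.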
The 4K demos in the TV shops have a specially rigged DVD player, playing back specially produced footage which has moving scenes optimised for low bandwidth. And a lot of CGI stuff which inherently has a low bandwidth due to many exactly identical colours etc.
I sometimes pop into the Apple shop in Brighton and play back some of my Vimeo videos (generally rendered to 50Mbps HD, although recently I have gone down to 20Mbps) on their nice big-screen devices, and they tend to struggle on the 50Mbps ones… although that could be due to limited ADSL speeds to the shop.
I have a near-top-end PC (12-core) with the fastest graphics card that is still fanless, and it plays back at least 50Mbps. It won't play back 200Mbps.
As Gallois says, editing 4K video is also slow, even with fast hardware. I have Vegas Pro 16 and it does it, but it isn't pleasant, with everything running slowly.
On a 15" laptop, 4K looks almost the same as HD (1920×1080).
And at the end, think about where the end result will be hosted. Vimeo (better than YouTube) downsamples HD to 5Mbps and 4K to 22Mbps. So 4K looks pretty crap unless it was carefully shot for a low bitrate (gimbal-stabilised camera, not much moving texture, etc.).
Gallois, that is very kind, thank you. Please do PM.
My concern from what I read is that even then an assessment on the specs alone can be challenging, because it is said that heat build-up often means that notebooks don't perform as well as the specs would suggest, hence my hope for some real-world experience.
To be fair, I am definitely not in the realms of professional editing, but I now would prefer to do my filming in 4K rather than throttle back on the quality. However, when I come to editing, the performance on my older MacBook Pro is almost too slow to be of value – albeit it does just about do it.
Peter – thank you. I work with a few people in the film industry and apparently no one is interested now in anything less than 4K and with a high rate as well – that is just the way it is. As I say, I have no aspirations to join the professional scene, but if the new MacBook Pro would do the job, I would prefer to take advantage, if only because in years to come, as we usually seem to find, true 4K becomes mundane, with everything working with it just fine.
I work with a few people in the film industry and apparently no one is interested now in anything less than 4K and with a high rate as well – that is just the way it is
Professionally I am sure that is true. You have to do what the clients want. But the “pro” 8K cameras cost well into 5 digits. They record RAW onto SSDs… @AdamFrisch is the specialist here
For “home movies” I think 4K is often wasted. I use it if there is geometric (lens) correction required (usually the case with the GoPros, not with the X3000) and would use it if needing to extract hi-res stills from the footage, but otherwise the result has no extra usability over high-quality (say 50Mbps) HD.
I work with high end digital moving images on a daily basis (mainly feature films).
The human eye is limited to 3K. This was demonstrated about 10 years ago in the NFT in London to an audience of professionals. The projection was switching between 2K (the current standard for DCP) and 4K. We could only tell the difference when approaching the screen to within about 1.5m.
In order to see the 3K (limit of the human eye), several conditions have to be fulfilled:
1. perfect projection equipment and lenses;
2. perfect eyesight (only a few percent of the population need apply);
3. you are supposed to sit in a row where you can see the entire screen without moving your head.
4K certainly has its uses in postproduction, such as more cropping options, but in the consumer world, 2K with a higher bitrate and more bits per colour would bring more viewing pleasure.
About 6 weeks ago I worked on ‘Guava Island’. This film with Rihanna can be seen on Amazon Prime. We received 5K files and delivered 4K processed images (adding film grain in order to create a 1960s look). When I compare the files we delivered to the UHD I can see on Amazon, quite a bit is lost in compression.
4K is the standard and today's future-proofing. We can argue it's overkill, but it's here to stay. Netflix demands that all their shows be shot natively in 4K. This created a huge problem for the industry, as the “standard” for image capture has for a long time been the Arri Alexa Mini. It is limited to 3.2K, however. So a lot of scrambling and shifting to REDs or bigger formats (LF) had to be done last year. Now, as large-format framing (Alexa LF, Monstro VV, Sony Venice) is rapidly becoming the new standard, this is no longer a problem (as they're all 4K or higher natively). Arri just announced they will release a 4K Mini in the fall.