Windows 10 eating my OS C??

Jan_Olieslagers wrote:

with the current prices of harddisks

Which is exactly why I use RAID. I have one array at home, where it’s always accessible. The backup machines are in different datacenters. I don’t really need RAID; JBOD would be just fine, but I have the space and the disks are cheap, so why not. At least a simple failure of a disk doesn’t interrupt my work.

Peter wrote:

I think my next PC will be fanless.

Years ago I had a machine from silentmaxx. It worked just fine, but since then I have solved the issue by removing all the desktop workstations and replacing them with rack-mounted machines living in a cabinet in a basement. I no longer have to care about how a computer looks or how noisy it is.

trashing it yourself and probably not realising it for days, weeks, months or years

Fortunately I can use versioning for most of the things I do. It sometimes eats a lot of space, but we already established that space is cheap. And I’m very careful with the tools that can cause damage. Actually, I take care to set things up so that, under the account I normally use, I can do as little damage as possible without seriously compromising usability.

Read again, Peter: my mirroring is done at the O/S level; the disks are connected to the motherboard SATA connectors. You are quite right that dedicated RAID controllers are to be shunned. And btw, the Linux RAID software mdadm is perfectly able to tell a virgin disk from one that has been configured in an array: it writes its own signature at the beginning of the disk, outside the data area.
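For the curious, that signature is the md superblock, and you can look at it directly. A minimal sketch, assuming a Linux box with mdadm installed (the device name is just an example):

    # Show the md superblock (the RAID signature) on a member disk
    mdadm --examine /dev/sdb1

    # Quick overview of the assembled arrays and their members
    cat /proc/mdstat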
Also, when the power supply blows out you replace it and that’s that – I’ve never had one destroy other components with overvoltage, though that is not impossible. Avoid buying the cheapest; that applies to everything, but especially to power supplies.

Also: if you lose some files and don’t realise the fact for several months or even years, they were perhaps not so very crucial? If you have crucial data that is only rarely accessed, an extra copy on a DVD is inexpensive and easy to keep an inventory of.

But I agree that most data loss is caused by human error (which I also see at work, where restores are among my duties).

Last Edited by at 05 Nov 13:39
EBZH Kiewit, Belgium

OTOH, mirroring the disks is easy, and if one disk breaks you simply replace it

Hmmm… interesting that that doesn’t usually work IME. The RAID 1 controllers I have used (pricey Adaptec ones) need a manually triggered process to accept the newly inserted HD into the array. One can see why… unless the new HD is totally devoid of data, the controller cannot safely assume which of the drives has the “good” data on it. In the bigger arrays it possibly can, and then you can have a simple hot-swap scheme. Apparently Google uses massive arrays of dirt-cheap hard drives.
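With the Linux software RAID (mdadm) mentioned above, the equivalent manually triggered replacement looks roughly like this; just a sketch, with made-up device names:

    # Mark the dead member as failed and drop it from the mirror
    mdadm --manage /dev/md0 --fail /dev/sdb1 --remove /dev/sdb1

    # Add the replacement disk; mdadm then rebuilds the mirror onto it
    mdadm --manage /dev/md0 --add /dev/sdc1

    # Watch the rebuild progress
    cat /proc/mdstat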

However, all this is moot when by far the most common way of losing data is

  • trashing it yourself and probably not realising it for days, weeks, months or years
  • the power supply blows up and takes everything out

and then not much will save you if you don’t have backups and a proper strategy to maintain multiple past backups. And almost nobody does the latter.
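One cheap way to keep multiple past backups is rsync with hard-linked snapshots; a minimal sketch, with made-up paths and no handling of the first run or of errors:

    # Drop the oldest snapshot and shift the others down
    rm -rf /backup/daily.2
    mv /backup/daily.1 /backup/daily.2
    mv /backup/daily.0 /backup/daily.1

    # New snapshot: unchanged files are hard-linked to yesterday's copy, so only changes cost space
    rsync -a --delete --link-dest=/backup/daily.1 /home/ /backup/daily.0/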

Administrator
Shoreham EGKA, United Kingdom

My 0.02 euros:

-) with the current prices of harddisks, RAID 5/6 isn’t worth the bother. OTOH, mirroring the disks is easy, and if one disk breaks you simply replace it. Easiest if you can buy one with the same geometry.

-) virtualising Windows is fine IF you do not need extensive facilities. My accounting software runs on Win98, and runs fine and always has. OTOH, when I lose the medical and want to take up FlightSim again, I’ll acquire a dedicated Windows machine for that.

My setup: like Peter, I bought the components for 2 PCs, but then I built 2 PCs from them. Both have two harddisks, mirrored through the Ubuntu operating system, and the important directories are synchronised between them with rsync. For the next couple, I hope to use some kind of distributed file system. And I must remember to buy five identical harddisks. I do still need to find a solution for disaster recovery, i.e. keeping a copy of my data outside the house. An external harddisk (USB?), kept in the car, is my best idea up till now.
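Roughly, that amounts to something like the following, presumably done with mdadm and rsync (device names, paths and the hostname are placeholders):

    # OS-level mirror of the two disks, created once when setting the machine up
    mdadm --create /dev/md0 --level=1 --raid-devices=2 /dev/sda2 /dev/sdb2

    # Keep the important directories in sync with the second PC, e.g. from cron
    rsync -a --delete /home/important/ otherpc:/home/important/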

EBZH Kiewit, Belgium

Not from my experience. I deal with loads of designers in my day to day work, and I can tell you categorically that it is a 100% native Mac environment.

Yes – I was referring to the capability of the hardware and the software.

Many years ago, a Mac was the only way to do a lot of stuff. Today, most of the software exists on both platforms and one can do the same work on a PC.

The hardware is the same and has been for years – ever since Apple moved to standard Intel-based PC hardware. That didn’t please their “fan club”, but presumably Apple did it because

  • they wanted to hedge their bets, or
  • they were presented with a fait accompli from Intel etc., who wanted to make only the standard PC chips

I think my next PC will be fanless. But they aren’t cheap – something like this. We have some fanless servers at work and they have proved to be really reliable. They dissipate about 20W (i5 processor). But today anything less than a top-end i7 is pointless for a new PC.

Administrator
Shoreham EGKA, United Kingdom

Flyer59 wrote:

The Mac is still the #1 computer in DTP and the creative field

Concerning the creative field, does the Mac Pro have ECC VRAM? I don’t think so, but I’m not sure. That would be really funny, since it has ECC RAM and offloads some of the workload to the GPU. I thought ECC was desirable at least for video rendering. And how well supported is 3D modelling in the Mac world?

Last Edited by Martin at 05 Nov 11:16

stevelup wrote:

Bootcamp is a pain as you have to reboot your machine and pretty much defeats the point.

Exactly. People usually multitask. If I reboot to Windows, I have to do everything in Windows. It would make no sense to reboot to MacOS just to write an e-mail. I might just as well use Windows for work full time and perhaps have a Mac separately for office stuff, browsing, etc. Virtualization allows you to run Windows and MacOS applications side by side, easily switching between them. Unfortunately, a lot of the applications I use are heavy on the graphics side. And with a Mac I can forget about certified drivers.

For sure, Apple owned the DTP market for years. But that changed a long time ago.

Not from my experience. I deal with loads of designers in my day to day work, and I can tell you categorically that it is a 100% native Mac environment.

Go into any agency and all you will see is vast swathes of Apple kit.

I tried all the virtual machines and hated them all. You never get the same performance.

I just start it with W7, do what I need to do in Windows, and shut it down asap :-)

Peter wrote:

For sure, Apple owned the DTP market for years. But that changed a long time ago.

Yes. From what I remember, they used Macs quite a lot in the 90’s. I deleted that sentence because I wanted to check which program it was and forgot. I don’t really know what they use now, I’m no longer in touch with the people I knew in this field.

One always learns the hard way to do backups; sometimes too late… Nowadays, when I build a PC, I buy two motherboards, two processors, two video cards, two power supplies

Well, I learned it in a course in high school. I don’t bother with that. Computers usually last me a very long time. By the time they pack up, everything has changed. And if something dies early, a replacement usually isn’t an issue. For me, it’s just a waste of money. And when I really need a machine to work, I just buy one from Dell (or HP) with proper support. That too might be a waste of money, but it has worked well over the years. This is another reason I don’t want to use Apple for work – insufficient support.

On a (sort of) funny note, before there were official Apple resellers, one unofficial reseller offered NBD service just like Dell. I think it was provided by the same company. And that was great. However, when they became official, this offer disappeared overnight. I wasn’t pleased about that at all.

RAID will be useful only if

I don’t really agree. RAID 5 or 6 (and some others) work just fine with a single drive missing, and without having the drive mirrored. You might just want to avoid having to rebuild the array. And if the disk isn’t new (say, less than 6 months old) but is about two years old or more, there is a good chance you will lose more drives in the near future. SW RAID has fewer issues, in my opinion – as long as the drives themselves survive (at least enough of them to rebuild the array), I can recover. I have verified it in practice (granted, simulated, but the hardware was completely different; only the drives were the same). Where I’ll sort of agree is that I consider it more important to have at least two arrays on two independent machines (ideally in two different locations), as I said earlier. A single array, no matter what kind (SW or HW, whichever level), won’t protect you from a failure of common components. Unless we go enterprise grade, with drives that support two controllers. And even then there is a risk of a single failure taking out the whole thing, if you are unlucky.
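That recovery path works because the array metadata lives on the drives themselves; a rough sketch of bringing a software array up on completely different hardware (device names are examples):

    # On the new machine, scan all drives for md superblocks and assemble whatever arrays are found
    mdadm --assemble --scan

    # Or name the members explicitly if the scan doesn't pick them up
    mdadm --assemble /dev/md0 /dev/sdb1 /dev/sdc1 /dev/sdd1

    # Check that the array came up, and whether it is running degraded
    cat /proc/mdstat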

You can use HW RAID to gain performance even with SSDs, but I don’t really see the point since we have PCIe drives available.
