It’s been ages since I last wrote anything here. I’d been planning a follow-up to my “Traits are Evil” post, which for some reason still gets quite a lot of traffic. The main reason I didn’t, though, is what I’m writing about now. That post, in case you missed it, was about PHP’s recently added Traits feature. I wrote down a number of fundamental problems with their implementation, and intended to write a follow-up listing a few valid uses.
The reasons behind this are twofold. First, PHP is a language with a long history of being abused by people who don’t really understand the intricacies of programming. I predicted (and still predict) that traits will be wildly abused.
The other reason is simply this: I’ve not written any PHP in over a year, and it has been bliss.
Almost two years ago, I ranted about how I can’t help feeling that companies are pushing “the cloud” as a means to bind (or even subjugate) consumers. I half hoped I would be branded a paranoid psychotic for putting things as sharply as I did.
Sadly, I recently stumbled across a post by one of Apple’s customers describing how Apple deemed it acceptable for its software to wipe 120 GB of files from his hard drive. Not Apple’s own files, but someone’s ACTUAL DATA.
If you bought a television, you wouldn’t accept the vendor walking into your home and burning all of your DVDs, would you? Why would you not hold Apple’s music-related software to a similar standard?
Anyway, here’s the post I’m on about:
“The software is functioning as intended,” said Amber. “Wait,” I asked, “so it’s supposed to delete my personal files from my internal hard drive without asking my permission?” “Yes,” she replied. …
Source: Apple Stole My Music. No, Seriously.
When traits first saw the light of day with the release of PHP 5.4, I didn’t really think much of it. I believed (and still do) that the alleged shortcomings of single inheritance exist mainly in the mind of the developer. If you ever run into a situation where you say to yourself, “Damn, if only there was a way to inherit from both these classes”, chances are there’s a flaw in either one of the existing classes or the one you’re writing.
Most commonly, the root cause of the problem is code that doesn’t adhere to the SOLID principles. If you’re not too familiar with concepts like DI, SRP, LSP and such, you might want to read up on them first, because I’m assuming a reasonable understanding of these principles throughout this post.
Anyway, back to multiple inheritance, or rather: back away from it. I’m not the only one to say that multiple inheritance is something to avoid as much as possible; people like Bjarne Stroustrup (the creator of C++) have said that there’s nothing you can do with multiple inheritance that can’t also be done with single inheritance. The full quote/reference can be found in the conclusion of this rant.
If you are thinking about learning C or C++, but have been “warned” by some that it’s really difficult, chances are that conversation was about pointers. When I first started learning some more low-level languages, the reactions I got fell into two camps. On the one hand: “Ouch, I never really got the hang of that pointer business, you know…”. On the other: “Have fun, get it out of your system. Once you realize what that entails, you’ll be begging to get back to a high-level language”.
Who was right? I honestly have to say that neither group got it quite right. Sure, pointers were confusing at first, and sure, writing a program that reads files and works some magic with strings is a lot easier in Python than it is in C, but I’ve really grown to like C. As for pointers: as far as I’m concerned, they really, really, really aren’t that hard. All you have to do is stick with it until the penny drops.
What follows are a couple of the common pitfalls people encounter when first trying to use pointers in C and, perhaps more importantly, how to avoid them.
I am a developer. Like most programmers, I try to follow the mantra “Be lazy; don’t write code that has already been written”. I preach this general truth to anyone who wants to hear it, and even more to those who don’t.
If you are anything like most programmers I know, this notion of copy-with-pride is nothing new. Yet I can’t help wondering: why are there still developers who work with proprietary software? Surely the biggest issue with proprietary software affecting us developers on a daily basis is that we see all these tools but can’t reuse any of their building blocks.
Warning – I’ve written this post in one fell swoop, and I am going to click the publish button within the next minute. I’ll be editing out the rough patches in the next few days. The topic of this post, however, is something I feel very passionate about, and I needed to get it published ASAP, even if it only gets one reader. I apologize for any odd sentences, typos and other inconsistencies that a decent edit would have fixed. Leave a comment, and I’ll fix them as soon as I can.
For the last couple of years, “the cloud” has surpassed the status of buzzword and made its way into the daily vocabulary of the common man. It’s also somewhat of an accepted fact that, within the next couple of years, a lot of data and work is going to move towards the cloud. Of course, there are major benefits to working in the cloud.
Cloud technology is brilliantly marketed as the ultimate expression of freedom with responsibility. Many middle managers firmly believe that the cloud is what the internet always should have been: infinite storage with easy-to-control access and enough security. But is it?