
Recent Articles

10 Jul

Deadlines: All Taking, No Giving


It’s pretty much an open secret that in software development, asinine (and sometimes completely outrageous) deadlines, like estimates, are a way of life. The comic below is, sadly, a typical representation of your average private-sector software engineer.

[Comic: the average private-sector software engineer]

That being said, it should come as no surprise that everybody has advice on how to combat the problem, from TechRepublic’s 10 tips for meeting IT project deadlines to the adoption of agile methodologies (see Deadlines in Agile by Stephan Kristiansen). The usual suspects get blamed: scope creep, inept employees and managers, and micromanagement from both up top and from the customer. But for the most part, it’s managers who cause the most issues, sometimes through no fault of their own (for instance, if the company is riding on the release of a product, it’s either get it done or everyone is out of a job, and we developers don’t get golden parachutes).

Ideally, a typical communication flow would look like the diagram below. The bold red arrows represent directives, requirements, and specifications for the project (e.g., deadlines). The thinner black lines represent feedback, which is processed and sent back down through the bold red arrows.

[Diagram: the ideal communication hierarchy]

This is what it looks like in real life, at least in my experience.

[Diagram: the communication hierarchy in practice]

The thickest red lines are the same as before. The thinner (but still thick, ’cause I like ’em thicc) red lines represent pressure, chastising, or other negative reactions. Notice that there is no feedback loop, only a one-way path. This phenomenon is called “shit rolls downhill,” and it is a common trope on any team that develops a product.

Some may argue that the idea is to get a product out the door before a competitor eats your lunch. While I agree with that, it can also be argued that getting a quality product out the door that stands apart from your competitors is just as important. To achieve this balance between speed of delivery and quality, always refer back to the Good/Cheap/Fast paradigm:

[Diagram: the Good/Cheap/Fast triangle]

I can appreciate that the people up top have a vision. I, and many other developers like me, want that vision to become a reality because we share it, too (in most cases; I’m not naive to the fact that some of us just want a paycheck). But reality can be disappointing, and because we live in that reality, management has to be open to the fact that some things just cannot be helped. It’s fine that you want a deadline, but you have to be realistic about it.

But they try to get around this.

Next time, we’ll examine the Good/Cheap/Fast paradigm and The Mythical Man-Month, and how wily managers try to cheat the system.

 

1 Jul

The Love/Hate Relationship with Estimates

[Comic: Dilbert on estimates]

Love ’em or hate ’em, estimates are a manager’s best friend and an engineer’s (developer/producer/whipping post) nightmare.

And they’re not going anywhere.

I was reading the Hacker News comment section today on the article Dear Agile, I’m Tired of Pretending by Charles Lambdin when the subject of estimates came up. Specifically, this comment by bougiefever:

I’ve been developing software for over 20 years, and I still can’t estimate how long something will take me when I’ve never done it before. This uncertainty needs to become more than just a stick to beat developers about the head and shoulders with. Most of the time the PMs understand this, but there have been many projects where they just don’t get it. I have suffered great anxiety from being forced to give estimates when the truth is I have no clue. It depends on how easy it is and how many unforeseen issues I encounter. It was so bad that once my husband asked me how long it would be before I was done cooking something, and I practically had a meltdown. That’s when I knew it was time to leave that team. Can we stop pretending we can forecast the unknown?

This was countered by xwdv:

No.

Even bad estimates are better than no estimates. If you are having meltdowns your reputation is being tied too closely to your ability to give estimates.

You must never turn estimates into a promise, always remind people they are estimates. (Emphasis mine –ATH)

Want to give fast estimates? Here’s how:

1) first determine the scale of the task? Is it a year, month, week or day kind of task?

2) Then, it’s just 3 of those units. The smallest task takes 3 days. One day to completely fuck up, one day to figure out why, one day to get right. The longest takes 3 years. One year to fuck it all up, one year to learn why, one year to finish it.

I suggest never giving estimates in units smaller than a day. They just become noise. If a task is smaller than dayscale just say the task is too small to provide any meaningful estimate but won’t take more than a day.
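To make xwdv’s rule of thumb concrete, here is a minimal sketch in Python; the function name, the scale-to-days table, and the output format are mine and purely illustrative.

# A rough sketch of xwdv's "three of those units" heuristic.
# Picking the scale is still a judgment call; this only turns that call into an estimate.
SCALE_DAYS = {"day": 1, "week": 7, "month": 30, "year": 365}

def quick_estimate(scale: str) -> str:
    if scale not in SCALE_DAYS:
        # Anything smaller than dayscale is noise: just say "under a day."
        return "too small to estimate meaningfully; under a day"
    days = 3 * SCALE_DAYS[scale]  # one unit to fail, one to learn why, one to get it right
    return f"about 3 {scale}s (~{days} days)"

print(quick_estimate("week"))  # -> about 3 weeks (~21 days)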

And lastly, to help solidify the talking points of this post, this response to xwdv by bb88:

> Even bad estimates are better than no estimates.

No estimate is clearly better. (Emphasis mine –ATH) Here’s a common story I’ve seen across multiple companies.

1. Marketing management asks Engineering management how long it takes to do feature X so they know when to launch the online ad campaign.

2. Engineering management then asks potentially good coder how long it will take. Coder replies with a time and “it’s just an estimate.”

3. Engineering management reports to Marketing that coder’s estimate leaving off the most important caveat, and Marketing treats that as the gospel truth.

4. Coder takes longer than expected because of some bad technical cruft that some other engineer put in because he was potentially rushed or just plain inept.

5. Marketing is pissed because they now have to withdraw the ad campaign, and starts blaming engineering.

6. Under increased scrutiny, Engineering gets a bad reputation, who then throws the coder under the bus in front of Marketing and other managers.

7. This shows up on the coder’s annual review who then leaves.

8. Engineering hires replacement which will have a 3-6 month learning cycle, and potentially writes worse code than the person that just left.

EDIT: The point is that if there’s no estimate, management has to deal with the uncertainty that the coder experiences. Hope for the best, plan for the worst.

There are two distinct ways a software developer will be looked upon by those outside their respective realm within an organization: as a wizard who can turn out miraculous, spectacular(?) software, or as an insufferable buffoon who seems to forget that in some cases people’s lives are in their hands, and yet still manages to drop the ball.

This is why I hate it when the upper echelon of my current employer calls me “the man.” Placing me on a pedestal like that is dangerous and sets a very bad precedent of me being a 24×7 miracle worker. I’m not the Jesus of software engineering, folks.

As for estimates, in response to xwdv: while you, the programmer, may reach a mutual agreement with the individual or group requesting the estimate that it is in no way, shape, or form a concrete obligation to deliver within that time frame, that agreement is moot when they turn around and present it as a promise to the next party, whether it be upper management, the customer, or your own mother. And this has led to some small uprisings in the software development community, such as the #NoEstimates movement. From the article Estimates? We Don’t Need No Stinking Estimates! by Scott Rosenberg:

The annals of software-project history are packed with epic train-wrecks. The best-documented ones are in the public sector, including the FAA and the FBI and Healthcare.gov. Private industry is better at keeping its pain to itself, but when the full tales of slow-motion belly-flops like Microsoft’s Windows Vista get told, it isn’t pretty. The most-cited numbers on software-project failure are those of the Standish Group, a consulting outfit that reported that in 2012 only 39 percent of software projects were considered “successful.”

Late software projects run up costs, incur collateral damage and sometimes take down entire companies. And so the software industry has devoted decades to waging a war on lateness — trying frontal assault, enfilade, sabotage, diplomacy and bribes, and using tactics with names such as object oriented programming, the Rational Unified Process, open-source, agile and extreme programming.

Estimates play a part in nearly all of these approaches. Estimates are the siege-engines of the war on lateness. If we use them carefully and patiently and relentlessly, the hope is, maybe, eventually, we’ll win.

Why is software so late? One venerable intellectual tradition in the field says the answer lies in software’s very nature. Since code costs nothing to copy, programmers are, uniquely, always solving new problems. If the problem already had a solution, you’d just grab a copy from the shelf. On top of that, we have a very hard time saying when any piece of software is “done.”

Non-engineers need to understand that software development is not an exact science; a lot of it involves trial-and-error. When the stakes are high, we need time to ensure that everything is copacetic. I know this advice is going to fall on deaf ears no matter how many times I try to teach management and others about how this industry works, but I know you, dear reader, will agree. In any case, perhaps this is a case for business coaches to start teaching their clients about software estimation and project management.

Whatever the case, estimates are not going anywhere, period. Without them, you’ll get pie-in-the-sky vaporware that, if it ever does come out, fails to live up to expectations (“You spent X [months/years] on it! Why does it suck?!”) and becomes yet another argument for either waterfall estimation or agile burndown charts (Duke Nukem Forever, anyone?). The lack of estimates is merely cannon fodder for management to rubber-stamp you as incompetent and find some other schmuck to make that promise. Worse yet, when that check bounces, you’ll still bear the brunt of the blowback.

If that’s the case, you might as well project something insane and walk it back. Hey, under-promise and over-deliver, right?

 

23 Jun

Visual Basic: Not as Shiny, Still as Viable

Today I came across the article The Rise and Fall of Visual Basic by Matthew MacDonald on Hacker News.

Today, Visual Basic is in a strange position. It has roughly 0% of the mindshare among professional developers—it doesn’t even chart in professional developer surveys or show up in GitHub repositories. However, it’s still out there in the wild, holding Office macros together, powering old Access databases and ancient ASP web pages, and attracting .NET newcomers. The TIOBE index, which attempts to gauge language popularity by looking at search results, still ranks VB in the top five most talked-about languages.

But it seems that the momentum has shifted for the last time. In 2017, Microsoft announced that it would begin adding new language features to C# that might never appear in Visual Basic. The change doesn’t return VB to ugly duckling status, but it does take away some of its .NET status.

Truthfully, the trend to sideline VB has been there for years. Serious developers know that key parts of .NET are written in C#. They know that C# is the language of choice for presentations, books, courses, and developer workshops. If you want to speak VB, it won’t harm the applications you build, but it might handicap your ability to talk to other developers.

[…]

Visual Basic has been threatened before. But this time feels different. It seems like the sun is finally setting on one of the world’s most popular programming languages. Even if it’s true, Visual Basic won’t disappear for decades. Instead, it will become another legacy product, an overlooked tool without a passion or future. Whether we’ve lost something special—or finally put an old dog out of its misery—is for you to decide.

Visual Basic may not be getting much love these days, but its functionality is still there. Mads Torgersen says that the ongoing strategy for VB, as of 2017, is to “do everything necessary to keep it a first class citizen of the .NET ecosystem” by focusing “innovation on the core scenarios and domains where VB is popular.” It is still being actively developed, used, and implemented.

The point is, a language’s popularity should not determine its usefulness. Michael Born said it best in his article Yes, CF is “Unpopular”. No, I don’t care:

Popularity is not the end goal. I’ll say it again: popularity is not the end goal! If your language of choice has a scheduled release every two years and hundreds of thousands of active developers, it won’t matter one bit unless that language is useful. Node.js, as a language, is almost worthless without its immense open-source ecosystem. You won’t find any real-world applications running on Node without the use of dozens or hundreds of npm libraries simply because Node is not useful in and of itself.

That’s not a bad thing! Node is an excellent language to learn, and is very powerful thanks to its immense popularity and large package ecosystem – but just remember that without the ecosystem, Node as a language would be a footnote in the annals of history.

We really need to stop with these asinine sunset articles and gloom-and-doom rants on programming languages. More importantly, we need to stop treating languages as fads, religions, or special memberships in the cool kids’ club.

 

9 Jun

YouTube: Turn Off the Lights, the Party’s Over

Was the YouTube Partner Program a viable way to make a career? When a machine is in the wheelhouse, probably not.

With the latest ad apocalypse (dubbed the “Voxpocalypse”), another large swath of YouTube content creators find themselves on the demonetization (and channel purge) list as YouTube cranks up its aggressive crackdown on content not deemed suitable for advertisers. Worse yet for some, targeted deplatforming campaigns, brought on by individuals who have caught a creator in their crosshairs, take a “grab the bull by the horns” approach to ensuring that major sponsors pull out, cutting into the creator’s monthly income (we’re ignoring whether or not this is warranted, because I really don’t feel like going down that rabbit hole).

I’ve been on the Internet since 1996, starting with a 56k dial-up modem in the absolute remote wilderness of North Carolina and Internet Explorer on Windows 95. Back then, the content people created (whether videos, blog posts, or images) was typically self-hosted on free hosting services such as GeoCities or Angelfire, or, if you were serious about your work, on a paid hosting provider with a custom domain name. People made money off advertising networks that displayed banner ads on their sites. In some cases, they could do very well if the traffic was good (this was in the age before ad blockers became a thing); most of the time, webmasters made money by selling merchandise or asking for donations.

The YouTube Partner Program launched in December 2007. Since then, many have tried (and ultimately failed) to make a career out of YouTube. From the article 96.5% of YouTubers Don’t Earn Enough to Cross the Poverty Line, Study Finds by Daniel Sanchez:

[In the year 2017], the team at Information is Beautiful found that content creators only made $1,472 [USD] after 2.2 million video views. This year [2018], The Trichordist noted that the video streaming platform paid a paltry $0.00074 [USD] per stream, a slight uptick from last year’s $0.0006 [USD] rate. With the company’s recent update to its monetization eligibility policy, a new study has found these numbers will only continue to get worse.
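As a rough sanity check on those numbers: $1,472 over 2.2 million views works out to about $0.00067 per view, right in the neighborhood of the per-stream rates quoted above.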

It goes without saying that the lion’s share of the advertising revenue generated goes to the platform itself. Profitability, however, remains a mystery. From the article Believe It or Not, YouTube May Spend More on Content than Netflix Does by Adam Levy:

If [Evercore ISI investment bank analyst Ken Sena’s] estimates are accurate, YouTube will account for approximately 5% of Alphabet’s total revenue in 2015. That percentage is expected to rise as YouTube grows faster than the more mature Google business. But compared to Google’s high-margin search engine and display advertising revenue, YouTube isn’t very profitable. The business has yet to generate bottom-line earnings, and it’s not clear that 2015 will be any different.

YouTube continues to invest in infrastructure, employees, and new products. Most recently, YouTube rolled out YouTube Red, which combines a full-fledged music streaming service with ad-free YouTube videos as well as original productions for $9.99 per month — the same price as Netflix and other music streaming services.

Eventually, YouTube will ease off the gas pedal and start turning a profit for Alphabet investors, just as Netflix will do for its investors. At that point, we might expect profit margins to climb to levels more in line with Google’s other major revenue sources. Last quarter, Alphabet reported an operating profit margin of 33%. At that rate, YouTube could add about $3 billion of operating profit to Alphabet in 2020. That’s about 20% of the company’s 2014 operating income.

That said, YouTube is going to put its own self-preservation first and foremost. What’s even more problematic for content creators is that YouTube’s biggest competitors, the traditional media, are keenly aware of the delicate relationship YouTube has with its advertisers. Thus, when a hit piece comes out that could paint the companies involved in a bad light, the algorithm is changed, videos and channels are tossed, behaviors are modified through demonetization, and YouTube tries to make nice with the old-hat media by placing them front and center. Then you have people with an agenda who make waves, triggering another wave of demonetizations and channel/video removals. Then there are disputes between online personalities and journalists. Soon it will come to the point where so much as accidentally passing gas on camera will trigger a mass exodus of advertisers.

To me, this just screams that people need to go back to what we started with: self-hosting, self-maintaining, and self-reliance. With the advent of cloud computing and scalability, it’s (theoretically) easier now to establish your own corner on the ‘net without the need to rely on a platform.

Of course, this doesn’t mean that large providers (Microsoft, Google, Amazon) are immune to social pressure. But in this case, the content producer is also a paying customer, not a means to an end for generating advertising revenue.

30 May

Appealing to Authority in Software Engineering

A great article I recommend reading for all programmers (and non-programmers alike) is Logical fallacies in software engineering by Artur Martsinkovyski.

A human brain is a complex machine that evolved over millennia. It works in the most peculiar ways possible and allows us to excel both at perception, dexterity and mind work. Some of its functions are a bit hacky. There are a lot of stereotypes and shortcuts that our mind takes in order to be more efficient and take less energy to fulfill the task. It helps most of the time although, being overwhelmingly erroneous at times, so that it leans you to the wrong decision, building an incorrect map of reality. The lenses of your perception may be flawed, the mechanism that grinds up the information you collect may be malfunctioning, your mapping can be highly incorrect.

Such errors have a name. This name is ‘fallacy’. A fallacy is reasoning that is evaluated as logically incorrect and that undermines the logical validity of the argument and permits its recognition as unsound.

Like other people of mind work, software engineers require a lot of thinking, analysis, and mapping of reality to fulfill their job. While doing these processes, our mind sometimes takes shorter routes to reach the destination, leading to wrong decisions or poor planning. To avoid that it is better to know your flaws.

He then proceeds to list the following fallacies:

  • Nirvana (perfect-solution)
  • Appeal to authority (argument from authority, argumentum ad verecundiam)
  • Historian’s fallacy
  • Misleading vividness
  • Survivorship bias

Let’s take a look at what I feel is the most common fallacy that we as software engineers (or any professionals, really) face constantly: appeal to authority.

[A]n assertion is deemed true because of the position or authority of the person asserting it. When explaining practices or opinions on some subjects of sofware development, project management, operations, e.t.c, people tend to use somebody elses saying, blogpost, conference talk or other claim as a foundation for justification of their own decision. Even though it might not always be fallacious, mostly it is better to add more contextual arguments that apply to specific solution or project rather than appealing to authority.

This can be a bit problematic when you have multiple people vying for a shred of authority in their role. I worked at a location once where everybody around me nitpicked every line of code I emitted because it didn’t follow their ideas. When challenged, they would fall back on either “This is the way he [architect] taught us to do it” or “This is a best practice” or the implied “I’m older than you and therefore smarter than you so you should bow down to me.”

I think programmers are now afraid to take the risk of applying what they know and have experienced, and instead fall back on the results of others. Let’s say that you write a method that causes a memory leak. When you’re called onto the carpet, you explain that the way you constructed the code was based on the direction and approval of the project architect. Even though you noticed the potential for a leak, you ignored it because, hey, why should you doubt the architect? Isn’t s/he supposed to be the smartest person in the room? In turn, you get thrown under the bus while the authority you yielded to denies all accusations.
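To make that hypothetical concrete, here is a minimal sketch in Python of the kind of leak that slips through when the blessed pattern goes unquestioned; all names are mine and purely illustrative. The pattern is a module-level cache that never evicts, so memory grows with every unique key for the life of the process.

# Illustrative only: an "architect-approved" cache that never evicts entries.
_results_cache = {}  # module-level, never cleared

def expensive_computation(payload):
    return sum(payload)  # stand-in for real work

def handle_request(request_id, payload):
    # Every new request_id adds an entry that is never removed; with
    # high-cardinality keys this is a slow, steady memory leak.
    if request_id not in _results_cache:
        _results_cache[request_id] = expensive_computation(payload)
    return _results_cache[request_id]

A bounded cache (for example, functools.lru_cache with a maxsize) would avoid the unbounded growth, but that’s exactly the kind of pushback that gets waved off with “the architect signed off on it.”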

At least with blogs, talks, and other tangible items, they can’t fight back.