Ferguson Missouri

 | August 27, 2014 4:20 am

I’ve been unable to avoid reading about what’s been happening in Ferguson, Missouri.  It’s hard to imagine how it feels to have your son shot down callously by law enforcement, to have nothing happen to the shooter, and to have the follow-up disregard and disrespect the victims and the concerned citizens affected by the murder.

What’s more dismaying is the fact that whites across America are not in an uproar about how their fellow citizens are being treated, simply because of their skin color.  The sentiment seems to be “c’mon we have a black president, this couldn’t have been racism”.  The degree to which people are selectively processing events in order to maintain this kind of attitude is impressive.

For me it’s absolutely clear that police are predominantly racist.  I suspect black policemen are, on average, equally or more racist than their white colleagues, although I have to admit that’s only a supposition.  I did have one experience which fixed this opinion in my head, though.

Years ago, while studying at Georgia Tech in Atlanta, I caused a traffic accident with my bicycle.  I pulled into North Avenue to see if I could safely make a left-hand turn (traffic was blocking my view to the right).  A truck was coming fast, so I pulled back.  The driver panicked (after whipping past me) and slammed on the brakes, causing the aged sedan behind — driven by a black man in his mid-thirties, with his partner and two children as passengers — to collide with the truck.

Now, it was entirely clear that either I or the white truck driver was to blame, but of course the police (one black, one white) ran the driver’s licenses of all involved.  The black family man must have had an outstanding warrant, because the police cuffed him and took him away in the patrol car.  Okay, so far so good; sucks for the guy, but presumably he did something to get himself into that situation.

The man was at all times polite and helpful.  Despite this, the black police officer felt the need to humiliate the guy in front of his children by hoisting him up by the trousers (classic bully wedgie) and shoving him into the police car — laughing while doing it.

I suspect the white cop wouldn’t have behaved that way.  I think, having had my own encounters with the police (and being white), that a white guy in that situation would have been treated a little better.  I am sure that if the guy had been wealthy, driving a Mercedes or something, he would certainly have been treated differently.

So maybe it’s not racism.  Maybe it’s classism combined with a callous, bullying, us-vs-them mentality.  Maybe the racism is merely a coincidence — the high correlation between poverty and race caused by our racist history.  I can’t really say.   Either way, the behavior is shameful, and we need to do something about how American police behave.


London today.

 | July 23, 2014 12:38 am

In London today for a bit of a holiday.  It’s a stopover on the way to Norwich, where we’re going to a friend’s wedding.

We took the train through the Chunnel.  There are a number of trade-offs involved in traveling by rail rather than by air.  The total travel time increases, but that increase is less significant than one might think at first glance.  First, we save the trip to the airport, which saves about a half hour.  Second, the airport wants us to check in 90 minutes in advance, which is unnecessary at the train station.  For a trip to London, the train station is in the middle of the city.  Once the train arrives, we have arrived, whereas arrival at the airport means another hour until we get to our destination — given the need to get luggage, disembark, etc.  Finally, if I take the average of airplane delays (in my life, maybe a half hour) and subtract the average of train delays (less than a minute on average in my life), I arrive at a compensation of about 3 hours, door to door.  So for short trips — which from Switzerland means Italy, Germany, France, etc. — the train is often a more efficient use of time.  Of course, going to London takes 8 hours by train, given that one has to transfer from Gare de Lyon to Gare du Nord, and given the ridiculous check-in procedures for the Chunnel train.  The flight is approximately 2 hours, meaning a door-to-door time of 5 hours.  So I would save 3 hours travelling by plane rather than by train.  Overall the train is a loss, but not a terrible one.

There are other concerns.  Of those 8 hours, I spend a couple in Paris’s largest rail stations.  I find this enjoyable.  It’s a brief visit to Paris, in which one sees only a little of the city, but rather a lot of the city’s people.  While it doesn’t really count as a visit to Paris, it counts a great deal more than a stop-over flight would.  It’s rather the same when travelling through a country by train.  When one travels over an area by plane, one sees no more of it than one would on Google Maps.  When you sit on a train as it travels through a land, it makes stops.  People embark and disembark.  Often you get to observe a small slice of their lives as you rocket through their lands at speeds which were unheard of even a hundred years ago.  So it’s my belief that one sees and experiences more by rail than by air.

Financially, there’s a real penalty for travelling by train.  This is not because air travel is more efficient than train travel.  Quite the opposite, in fact — the efficiency of train travel, and its implications for climate change and resource consumption, is our primary motivation for choosing it over flight.  The penalty is simply an artefact of the enormous subsidies that airlines enjoy.  Should rail travel enjoy such a large level of support from our taxes, the economics of rail travel vs. air travel would likely change.  This can only happen if governments begin to show more wisdom with regard to issues like climate change and resource consumption, which in turn is only likely to improve when citizens grow more wise.

Which brings me to the real reason we travel by rail, even when there is a financial and temporal price to be paid.  When we travel by air we rob from future generations and the poor of the planet, as it is future generations and the planet’s poor who suffer most from climate change and our modern excesses.  Once, when explaining my reasons to a colleague who should already have understood them, he replied with some banality about the need to be happy balanced against some uncertain future.  For the vast bulk of human history people have been unable to travel by air, let alone travel cheaply with wanton abandon.  Are we only now able to be happy?  In fact, evidence suggests that modern excesses only make us less happy.  While I can’t prove it, I’m comfortable asserting that selfishness and short-sightedness can only diminish our happiness.  Implying that flying is a vital component of happiness is as absurd as asserting that the suffering of climate change is uncertain, or even in the future.


Cleaning out “using namespace std;” declarations from header files.

 | June 10, 2014 8:00 am

I recently had to work with a very large codebase in which each and every file included a header containing the statement “using namespace std;”.
As a result, hundreds of header files used std strings, pairs, etc. without any std:: qualification.

Cleaning this situation up by hand would have taken weeks and been error-prone, so I wrote a little script to do it for me, and called it standardize.pl.

The variable possible_offenders is a list of standard C++ names which appear frequently in the code in question.

The script recursively searches a directory for .h and .cpp files.  For .cpp files, it checks whether any of the possible_offenders occur in the file.  If so, it adds a “using namespace std;” directive if none exists.  Thus .cpp files are changed minimally.

For header files, all occurrences of “using namespace std;” are removed, and all occurrences of possible_offenders are prefaced with an explicit std:: namespace specification.  Care is taken not to change occurrences in comments or in quotations.

If you are faced with a similar situation, you can find the script on github: https://github.com/spacemoose/standardize


cout, buffering, and premature pessimizations.

 | June 5, 2014 4:10 am

The other day I was in a discussion with some C++ developers, where one of them stated, most definitively, that “cout is not buffered”.  Now, I have to admit that I was flabbergasted by how wrong this assertion was, and my first instinct was to question the capabilities of the developer in question.  As I look back on the vast majority of the code that I have worked with, however, it’s pretty clear that most C++ developers are either unaware that cout is indeed buffered, or unaware of the side effect of std::endl, or they just don’t think about the impact it causes.  Consider the following two lines of code:

  std::cout << "some text" << std::endl;
  std::cout << "some text\n";

Now, neither of these lines of code is more readable than the other.  Neither is more maintainable than the other.  The endl variant can, however, take significantly more time to execute than the \n variant.  Why?  Because std::endl has two effects:

  1. It inserts a ‘\n’.
  2. It inserts a std::flush into the stream, flushing the buffer.

If you are using a std::endl where a ‘\n’ will do (i.e. you do not need to explicitly flush the buffer), you are creating what Sutter and Alexandrescu call a “premature pessimization” in their excellent book “C++ Coding Standards”.  Despite this, the endl variant is much more common.  Whenever I ask anyone why they are using endl all over the place instead of ‘\n’, the typical answer is “well, it’s more the C++ way to do things”.  That’s just not true — meaningless resource consumption is not the C++ way to do things.

Rule 1 is “don’t optimize prematurely.”  This means you should not make your code less readable, more complex, or less maintainable for the sake of dubious performance benefits.  A corollary to this rule, however, is “don’t pessimize prematurely”.  If two variants are equally readable, equally clean, and equally maintainable, prefer the more efficient variant.  This is just a question of correcting ignorance and forming good habits.

So you might be curious whether this performance difference is measurable, and the answer is: of course it is.  You can test it yourself with the following benchmark:

#include <iostream>
#include <chrono>

namespace chrono = std::chrono;

int main ()
{

	constexpr unsigned int numLines = 100000;
	auto start = chrono::high_resolution_clock::now();
	for (unsigned int i = 0; i < numLines; ++i)
	{
		std::cout << "This is a prematurely pessimized line" << std::endl;
	}
	auto pess = chrono::high_resolution_clock::now();
	for (unsigned int i = 0; i < numLines; ++i)
	{
		std::cout << "This is not a prematurely pessimized line\n";
	}
	std::cout << std::endl;   // flush the buffer so the comparison is
							  // only biased in favor of pessimized
							  // code.
	auto np = chrono::high_resolution_clock::now();
	double durp = chrono::duration_cast<chrono::milliseconds> (pess-start).count();
	double durnp = chrono::duration_cast<chrono::milliseconds> (np - pess).count();
	/// Use cerr for benchmark results, so we can redirect the noise.
	std::cerr << "\n==============================\n"
			  << "pessimized code took: " << durp << "ms.\n"
			  << "unpessimized  took  : " << durnp << "ms.\n"
			  << "Buffering saved: " << durp-durnp << "ms., or " << 100* (durp-durnp)/durp
			  << "% speedup." << std::endl;

}

Compiling on gcc with -O2, I get the following results:

Output to terminal:

==============================
pessimized code took: 5370ms.
unpessimized  took  : 4806ms.
Buffering saved: 564ms., or 10.5028% speedup.
~/Personal/Miscelaneous[master]$ ./a.out > /dev/null

Output to /dev/null
==============================
pessimized code took: 45ms.
unpessimized  took  : 6ms.
Buffering saved: 39ms., or 86.6667% speedup.
~/Personal/Miscelaneous[master]$ ./a.out > tmp

Output to file:

==============================
pessimized code took: 365ms.
unpessimized  took  : 79ms.
Buffering saved: 286ms., or 78.3562% speedup.

If you are writing code which uses output streams a lot — logger functionality, say, or file output — this can make a huge difference to your resource consumption, and you’ll never see the needless waste in a profiler.  So form good habits.  Unless you need to flush the buffer for some reason (which is in fact a rare need unless you’re dealing with concurrency issues), prefer the ‘\n’ construct.

 


Swiss nonconfrontationalism.

 | June 2, 2014 12:37 pm

I have been living in Switzerland since September of 1996; at the time of writing, that makes 17 years. I would estimate I have lived in the United States for 14 years, and in Canada for 12 years. So altogether I have lived in the Canmerican culture for 26 years, some 9 years more than I have lived in Switzerland. This gives me an unusual perspective on the three cultures.

I have frequently heard the complaint that Swiss people are averse to confrontation, and as a result never say anything directly. This can manifest itself in weird ways, like leaving notes in the laundry room saying “So bitte nicht!” (“Please, not like this!”). I hear this criticism from Swiss people as much as I have heard it from fellow foreigners. People often make a conceptual connection to Switzerland’s political neutrality, which I never came to myself. I’m a big fan of Switzerland’s political neutrality. That national characteristic is a large part of why I feel at home here. I actually get a bit resentful towards foreigners who live here for the money but despise Switzerland’s neutrality.

Until recently, though, I was still critical of Switzerland’s aversion to confrontation at a personal level. For an American, it makes navigating a relationship with a Swiss partner fraught. Sometimes the characteristic seems taken to such a cartoonish extreme as to be unbelievable (at least to someone from outside the culture). This can lead to some pretty bizarre relationship stories. As a result of collecting some negative experiences, I could quite relate to people expressing resentment of this trait.

Now I find myself in a wonderful, happy relationship with a Swiss woman. While she is an impressive woman in her own right, some of the things that I love most about her are quite “typically Swiss”. In fact she takes real pride in being Swiss, and cherishes Swiss culture in a deep and meaningful way. She is wise in her love of her culture. She doesn’t love her country and culture indiscriminately. She confronts many of the very real problems that Swiss culture does possess in a head-on and responsible way. Besides making her a remarkable human being, this characteristic makes her a remarkable teacher. The integration of Switzerland’s overwhelmingly large immigrant population is a real and pressing issue, and her ability to love and criticize her country simultaneously and intelligently is a real gift to her students.

So my love and respect for my wife make me look at certain aspects of Swiss culture in a more loving and respectful way. While Bettina’s tendency to avoid confrontation is a frequent topic of discussion in our family, my tendency to dive directly into confrontation is of equal concern. I believe, though, that I have more to learn (and have learned more) from the Swiss characteristic. By now I have worked with a number of Swiss managers, and I believe that the most effective of them also pursued this path of aversion to conflict.

The problem is that saying “aversion to conflict” isn’t really precise enough. At a cultural level, the desire is to avoid emotions flaring up during a conflict, which would disturb the problem-solving process. As I write that, I find I can’t help but think of Swiss political neutrality. So now that I am able to love them both, I see these traits as being two aspects of a single trait. When I resented one and loved the other I was unable to see the connection. I find that fascinating.


Consume less, fly less.

 | November 29, 2013 3:10 am

I recently watched this video, which I felt did a fantastic job of representing my feelings. I’m really impressed with Kevin Anderson and Alice Bows-Larkin.


Haste makes waste

 | November 25, 2013 6:41 am

When I was in university I attended a couple of semesters of psychology. One of the things that made the strongest impression on me was a graph the professor showed of human performance vs. motivation.  I googled around a bit to see if I could find a copy of the curve, and I came across the following, which offers a little more interpretation than my psych prof did, but does an excellent job of communicating the point.  It comes from an excellent post at psyprogrammer.com.

 

[Image: the human performance curve — performance vs. motivation]

 

As I understand it, this curve comes originally from studies on factory productivity.  While it’s difficult to obtain bulletproof data on harder-to-quantify tasks like programming, my take from the psych lecture was that this curve extends pretty universally across all human activity.  This not only matches personal experience well, but seems to be pretty well accepted by the scientific community.  An interesting aspect of the curve is its asymmetric nature: being slightly undermotivated has significantly less negative impact than being overmotivated by the same degree.

It’s an unfortunate truth that most people in managerial roles are woefully unfamiliar with this graph.  They typically overwork both themselves and their employees, to the detriment of productivity, all the while taking a macho sense of pride in how hard they work.  I can’t tell you how often I have heard people bragging about the excessive number of hours they work, while the rest of us groan under the weight of their emotional instability, error-prone work, and poor judgment.

Even when employees are disciplined about working reasonable hours, getting sufficient rest, and maintaining a positive work-life balance, excessive demands take a toll on productivity, and this is no less true of software development than of any other task.  Software development is a kind of craft, in which intellectual discipline, creative thought and disciplined craftsmanship must all be combined to produce optimal results.  Managers who seek to whip their development teams up into a frenzy of panicked development wind up destroying their teams’ productivity.

I have all too often found myself in the position where management comes by the development teams on a regular basis to tell them “If we don’t develop feature X in time Y, we will go out of business.  The fate of the company is in your hands!”  Typically X and Y vary heavily, even within the same project, sometimes over very short time intervals.  Every time I have found myself in this situation, the deadline has been missed, and the company survived.  Besides resulting in skepticism and distrust of management, this has some very harmful effects.

In the best case, the programmers ignore the dire warnings, maintain a Zen-like attitude to their work and continue to strive to perform to the best of their abilities.  In this case the only harm is the destruction of trust in management.  I’ve never actually seen this case, but it is a theoretical outcome.  In the worst case people take the warnings/threats seriously, start hurrying production, and start sacrificing their personal time to the goals of the project.  Shortcuts are taken and technical debt is accrued in an attempt to meet short-term goals.  Developers get stressed and start to resent design discussions, making the discussions longer and less fruitful.  Developers get tired, and their judgment and emotional equilibrium are impaired.  Stress leads to strife and illness.  In their haste, developers stop taking time to mull their algorithms over, and errors creep in.  The time it takes to fix these haste-bred bugs dwarfs the time it would have taken to calmly develop a correct implementation, which spirals the project further into stress, cynicism and despair.  Stressed-out workers get sick, and feel pressured to work anyway, so they infect their stressed coworkers, leading to still more lost productivity.  Eventually the project collapses under the weight of exhaustion, mistakes, strife, and technical debt.  Actual results tend to fall somewhere between these two extremes, but heavily weighted towards the latter outcome.

I meditate regularly.  I read books on Zen and try to put what I learn into practice.  I exercise religiously, eat healthily and have a very happy relationship.  Still I find myself slipping into unproductive stress levels when harangued by emotional and irrational managers.  Today I spent about an hour tracking down and debugging a completely moronic error.  I had written the following code:

/// If there is a next element, and it should be written, write a ",\n";
/// otherwise write a ";\n".
m_fstream << (((i+1 < res.GetSize()) && shl::is_a_supported_type(res.GetItem(i+1).GetChoice())) ? "," : ";")
          << "\n";

It actually does exactly what the comment says it does, but it should in fact only write a semicolon if we are at the end of input.  This bug crept in under precisely the conditions I described above: a manager ranting and raving on a daily basis about how we would go out of business if this feature wasn’t implemented yesterday, and why the heck is it taking so long, and so on.  I was chastised for spending so long testing the code which was delivered, but the above bug wasn’t detected for two weeks, which implies that no one noticed the error for two weeks.  Since this bug prevents data from being successfully loaded into the client application, this casts a bit of doubt on how urgent the update really was.  In any case we still have to rebuild and deploy the bug fix to the customer before he can use the feature, which means another hour or two of developer time, and a total of about 3 weeks of delay for the customer.  All because I felt stressed enough to get out of my zone and make a completely stupid mistake I would never have made under less hurried circumstances.  Doubtless this is not the last such bug I will discover, either.

All of this is an avoidable phenomenon if one understands the relationship between motivation and performance.  If you really care about doing your best work, you won’t let yourself get overmotivated (stressed, hurried), and will try to keep yourself in the optimum zone.  Sadly this is terribly difficult if your manager is a stress junkie who perceives healthy, productive workers as undermotivated.

 


“Standardize” is not a synonym for “improve”.

 | September 12, 2013 1:07 am

When I was working for a mega-corporation, one of the recurring battles I had to fight was the standardization battle. The assumption was “if a little standardization is a good thing, then a lot of standardization must be even better”. Of course this is simply not true. Excessive standardization stifles creativity, decreases productivity and increases risk. Standardization is a useful tool to increase productivity where it is appropriate. But just as you shouldn’t hammer in a screw, there are places where you should think about deregulating, not standardizing.

Recently I found myself in a conversation with a new coworker who asked me “So what are you standardizing?” (out of the blue, no context). I was a bit mystified and replied that I wasn’t standardizing anything, I was developing code. On a separate occasion he asked me if I had any ideas how we could “standardize our testing”.

One of the sad truths at my current workplace is that we simply don’t have enough testing going on. We need to improve our testing. We need to extend it. We need to develop it. We don’t need to standardize it, though… what good would that do?

To standardize something is to take something inhomogeneous and make it more homogeneous. This is a useful technique — for example, standardizing communication protocols and power outlets has been a tremendous boon. But in a software shop that is just doing chaotic testing, you don’t start with the question “how can I standardize this?”; you start with the question “how can I improve this?”. It may be that developing standards is one of the tools you use, but it may not be. In our shop we have three teams — a back-end team, a middleware team, and a front-end team.

For the back-end team we need more unit tests, automated functionality tests, and automated integration tests. We don’t really need any manual test execution. For the front-end it’s quite different. There we need a database of usability tests, most of which will probably have to be executed manually. This means that even the testing schedules will likely have to be quite different, meaning standardization will probably play only a small role, although formalization might play a larger one.

So all of this went through my head when the question was asked, and I replied, “Standardization is the wrong word. We need to improve our testing. We are working on that.” The disturbing thing in the conversation is the realization that the word “standardization” seems to have lost its very specific meaning. I suspect that when managers and executives get in a room and someone says “We have standardized our testing”, everyone responds “bravo, well done”, rather than asking the obvious question — “what benefit did the standardization bring?”.


 | August 22, 2013 2:54 am

Considering that:

  • It’s best practice to do warning-free builds.  Those harmless warnings you are ignoring could be hiding an important warning that’s flagging a real bug in your code.
  • Recent versions of GCC support disabling specific warnings.
  • Sometimes we use external libraries which we trust, or must accept — we don’t want to muck around with their internals — say, for example, the boost libraries.

Given these points, when we include a header which creates warnings, we’d like to disable just those warnings, for just those header files.  This can be done with the following statements for the GCC compiler:

#pragma GCC diagnostic ignored "-Wparentheses"
#pragma GCC diagnostic ignored "-Wswitch"
#include <tl_base_data.h>
#pragma GCC diagnostic warning "-Wswitch"
#pragma GCC diagnostic warning "-Wparentheses"

This comes up often enough that I cranked out a trivial bit of elisp so I can do this in Emacs a bit more automatically:

 

;; @todo let this work if we have a range too.
(defun insert-pragmas (pragma-name)
  "Wrap the current line with a pragma to disable the warning."
  (interactive "MWarning to disable: ")
  (if (string= "" pragma-name)
      (message "ignoring empty pragma name")
    ;; `if' treats all forms after the else branch as an implicit progn.
    (move-beginning-of-line nil)
    (insert "#pragma GCC diagnostic ignored \"-W" pragma-name "\"\n")
    (move-end-of-line nil)
    (insert "\n#pragma GCC diagnostic warning \"-W" pragma-name "\"\n")))


Trayvon Martin

 | July 18, 2013 2:00 am

My response to the Trayvon Martin verdict has been complex. I didn’t follow the case closely, having read primarily superficial discussions of it until that point. When I heard about the verdict I was disappointed but not surprised. When Rodney King’s attackers were acquitted I was shocked and surprised. The Zimmerman verdict shows that white America hasn’t gotten much more civilized. I think one could argue that black America has, since the response from black Americans has been reasoned and reasonable. One can hope it will be effective as well.

I found some of the response worth commenting on. William Saletan wrote an article on Slate titled “You Are Not Trayvon Martin”, which I found particularly disturbing.  His thesis is that:

The problem at the core of this case wasn’t race or guns. The problem was assumption, misperception, and overreaction. And that cycle hasn’t ended with the verdict. It has escalated.

He goes on to explain that Zimmerman wasn’t a racist, that the whole thing was a stupid sequence of events based on fear, overreaction and misperception.  All of the latter points would seem to be factually true.  Whether or not Zimmerman is a racist is a more difficult question.  Standards as to what constitutes being a racist change over time.  Behavior and opinions which would have been considered egalitarian and progressive in 1892 would be considered backward and rampantly racist today.  The evidence does suggest that Zimmerman is no self-identifying racist longing for the glory days of slavery or the Klan… but he would seem to be a racist to the extent that he finds a black teenager walking down the sidewalk in a hoody scary and suspicious.

Zimmerman’s attorney made the assertion (28:02 of the video below) that Zimmerman would never have been charged with a crime had Zimmerman been black. That’s an interesting assertion. Had Zimmerman been black and killed Trayvon Martin, would he have been charged? I don’t know. But I think we should ask ourselves some other questions:

  1. If Trayvon had been white, and Zimmerman black, what do you think would have happened?  The statistics would suggest the death penalty would be likely.
  2. If Trayvon had been white, would Zimmerman have acted as he did?
  3. Put yourself in Trayvon’s situation: you’re walking to a friend’s home wearing a hoody and carrying a bag of Skittles and a soda pop.  Some guy starts giving you a hard time, so you run away.  He runs after you.  You defend yourself and he shoots you dead.  Wouldn’t you want the justice system to take some action against the guy?


I think it’s exceedingly naive to think that race, prejudice and racial profiling had nothing to do with this sequence of events.  I’m annoyed at journalists’ attempts (like those of Mr. Saletan) to dismiss people’s efforts to address the problems that black people still face in America.  People, including my white self, are angry and frustrated at this outcome and what it says about our society.
