Recording Events

While dealing with the recordings from a recent event (the second edition of OpenSaturday, some pics here), I had a sudden flashback to past occasions when I’ve put time into recording what was going on.

These are a few points to take into account, to maximize usable footage:

Don’t move the camera

No matter what you do, avoid moving the camera. Before the event starts, look for a good spot. Take into account where people are going to be, and make sure no one will be in the way. Point the camera at the area of interest (i.e. the stage) and cover as much of it as possible. If it’s a talk, coordinate with the speaker – make sure their range of movement is covered by the camera.

Use more than one camera

The preference is that they overlap, because something will go wrong with the first camera – and remember, you can’t move it. A second camera pointed at the area of interest from another angle should do the trick, because someone will walk into the first camera’s field of view, or one of the cameras will fall, or the speaker will wander outside the planned area.

A third one, pointing at another potentially interesting place, like the audience, may be moved from time to time. When you point it at a place, wait at least three minutes before touching it again. Not in your head. Use a clock.

Those angles should provide enough material for transitions, reaction shots, and general editing.

Record continuously

Editing can shorten a long talk or event, but it can’t make it longer. What’s lost is lost.

Run a test drive

Set the cameras to record, simulate some talking from different spots, move around, then grab the recordings and play them back. If it’s all good, then it’s all good.

Get the audio

If possible, use an independent audio recording device in addition to the camera’s.

Have Fun Editing

Now you’ll have a lot of material to choose from, and will be able to get good angles and good sound on most of what happens during the talk or event. The picture will be steady and clear. This is already a lot better than what amateurs usually manage.

Avoid an excess of transitions, and look for a narrative. I like to watch stand-up comedy videos and take cues from the cuts to the audience and the changes of angle. If we stick to this, and improve upon it, our communities will all be better off, because sharing talks will be less awkward.

Some Comments On Code Optimization

I often code for fun, building tiny toys to amuse myself. Be it simple board games that play themselves (tic-tac-toe and checkers), or some simulations (I have some Windows 7 gadgets lying around), or… well, you get the idea.

In the simulations, for which I developed a taste thanks to a former coworker, I tend to sit down and consider every little calculation I’m going to be performing. I usually run them on an HTML canvas, which both makes them suitable for use as Windows 7 desktop gadgets and lets me have good fun in a healthy environment: it’s easier to e-mail one HTML file than an exe, and the tools for profiling browser JavaScript code are good to be familiar with.

In any case, one of the things I learned while doing this is that the way you run your loops has a tiny impact on performance. It adds up when you try to calculate an immense amount of vectors applied to an immense amount of particles in a dynamic array… and as we’re running on top of a VM on top of a browser on top of the OS, every little bit counts. Whenever possible, I write the loops in the form

while (n-- > 0) {
    // do stuff; n counts down towards zero
}

Why? JavaScript implementations, just like the Java VM, the CPython runtime, and x86 assembly, have special codes to represent a handful of numbers, usually including at least 0, -1, and 1 through 5. Not a big deal – unless you’re trying to avoid cache misses. As it’s a cheap optimization to make (it doesn’t hurt readability), I use it whenever it makes sense.

This kind of optimization is nice and clean fun, but not necessarily impactful. The best way to optimize code is:

  1. Write it in a readable way; forget about optimizations.
  2. Measure the hotspots, that is: measure where time is actually being spent (see the sketch below).
  3. Optimize those hotspots.
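
To make step 2 a bit more concrete, here is a minimal hand-rolled timing sketch in Java; busyWork is a hypothetical stand-in for whatever routine you suspect, not code from any of my gadgets. A real profiler will paint a much fuller picture, but even crude timing beats guessing.

public class HotspotTiming {
    // Hypothetical stand-in for the routine suspected of eating the time.
    static double busyWork(int n) {
        double acc = 0;
        while (n-- > 0) { // the countdown idiom from earlier
            acc += Math.sqrt(n);
        }
        return acc;
    }

    public static void main(String[] args) {
        long start = System.nanoTime();
        double result = busyWork(10_000_000);
        long elapsedMs = (System.nanoTime() - start) / 1_000_000;
        System.out.println("busyWork took " + elapsedMs + " ms (result: " + result + ")");
    }
}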

Of course, before ever writing a line of code, some thought must be given to what you’re going to write. The fastest code is the one which does the least, at any given level. In the example above, the optimization is at the bytecode or assembly level; when you choose the wrong data structure or the wrong algorithm, you affect every level. This is the value of that “algorithms and data structures” class or book which you didn’t pay too much mind to at school.

I’m going to put forward two examples of choice, one of data structure and one of algorithm.

An Example of Data Structure Choice

If you need to sort a list of existing data, you may choose to use a list – after all, the amount of data to be sorted could change over time. But different implementations of a list have different characteristics. In Java, for example, you have the LinkedList and ArrayList implementations of List (among others), and any implementation of List can be passed to Collections.sort(List<T> list). These implementations, of course, mirror what you could write by hand if you wanted to. Indeed, I’d urge you to write an ArrayList implementation yourself as an exercise.

In any case: Which one would you use?

If you chose the ArrayList, pat yourself on the back. To sort data, you need to compare elements constantly, and the fastest way to do that is with a contiguous, indexed block of memory. The difference in performance can be observed on an ordinary desktop sorting random integers.
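
As a rough illustration (a back-of-the-envelope sketch, not a rigorous benchmark: JIT warm-up, garbage collection and the like are ignored), here is a small Java program that fills both list types with the same random integers and times Collections.sort() on each:

import java.util.ArrayList;
import java.util.Collections;
import java.util.LinkedList;
import java.util.List;
import java.util.Random;

public class SortComparison {
    // Time Collections.sort() on whatever List implementation we're handed.
    static long sortMillis(List<Integer> list) {
        long start = System.nanoTime();
        Collections.sort(list);
        return (System.nanoTime() - start) / 1_000_000;
    }

    public static void main(String[] args) {
        Random random = new Random(42);
        List<Integer> data = new ArrayList<>();
        for (int i = 0; i < 1_000_000; i++) {
            data.add(random.nextInt());
        }

        // Same data, two different backing structures.
        System.out.println("ArrayList:  " + sortMillis(new ArrayList<>(data)) + " ms");
        System.out.println("LinkedList: " + sortMillis(new LinkedList<>(data)) + " ms");
    }
}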

An Example of Algorithm

I’ll share something very real: an e-mail from GNU grep author Mike Haertel on a FreeBSD mailing list in 2010, in which he dissects the reasons GNU grep is faster than BSD grep. Long story short: GNU grep executes fewer operations to achieve the same goal. Read it through; I could hardly improve upon Mr. Haertel’s exposition.

Conclusion

Toy around, and have fun with your micro-optimizations all you like. But be careful with the algorithms and data structures you choose.

Sometimes you’re calling some library or external service only to discard the result because some condition isn’t met; checking the condition first would save the call entirely. This kind of mistake is especially abundant in inherited codebases that accrue functionality over time.
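
As a purely hypothetical illustration (expensiveLookup is a made-up stand-in, not drawn from any real codebase), the shape of that mistake, and of its fix, looks something like this:

public class GuardBeforeCall {
    // Made-up stand-in for an expensive library or external-service call.
    static String expensiveLookup(int id) {
        try { Thread.sleep(100); } catch (InterruptedException e) { }
        return "result-" + id;
    }

    public static void main(String[] args) {
        boolean conditionMet = false; // imagine this comes from business logic

        // Wasteful: the call happens even when the result will be thrown away.
        String result = expensiveLookup(42);
        if (conditionMet) {
            System.out.println(result);
        }

        // Better: check the condition first and skip the call entirely.
        if (conditionMet) {
            System.out.println(expensiveLookup(42));
        }
    }
}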

Most important: if code is being slow, use a profiler. Optimize the hot spots, leave the rest alone. How good can this be? Read the answer here. Have fun :^)

On Sharing Knowledge

We’re filled with many amazing abilities which we consider mundane because we’re so used to them. Some of them are mentioned and described here.

Sharing knowledge is one such ability – if only a few people were capable of doing it, that group would be inconceivably more powerful than the rest of humanity combined. In a matter of perhaps months, they’d accumulate experience beyond what anyone else could, and be able to apply it accordingly, earning the capacity to outdo the rest of existence.

Now, that scenario is perhaps too extreme – and too far removed from our daily lives. But even a slight difference in the capacity to share knowledge can compound heavily over time; this is directly observable in the difference between teams where people jealously guard their knowledge and position and teams where members are more carefree with their information and techniques. It is also observable in the difference between someone dabbling blindly in a particular discipline or area of knowledge (or just observing people) and someone with one or more mentors.

Transferring knowledge through text, sound or performance, and particularly through the more deliberate variants (dissertations, presentations, essays, tutoring, mentoring, demonstration), pushes the learner’s progress in ways that can be non-linear, helping people develop previously unsuspected insights.

The achievement of these insights is so pervasive among hackers – a knowledge-work-oriented population if there ever was one – that there are terms for different textures of insight: “zenning” and “grokking”. There are people who consistently help others achieve insights, and they attain folk-hero status among hackers. Brian Kernighan and Donald Knuth are two superlative examples of this breed.

We should, then, as a society, embrace, support and boost efforts aimed at sharing knowledge. We do try, really… the school system facilitates mass, intergenerational knowledge transfer with some success.

There’s been a wave of internet users turned teachers, mentors and sharers that gives me hope. Systematic efforts like Open Source Ecology and Khan Academy fill me with hope. The sometimes pell-mell efforts by individuals sharing information on subjects as diverse as personal appearance and cooking make me ecstatic.

Yes, books are there, and other media. But this is a new kind of effort, one which takes the power of the internet and the services currently running on it (such as search and hosting) to reach an amazing number of people, who can then remix, enrich and reshare knowledge from thousands of points of view, from all walks of life. The potential for amazing results is huge, as the knowledge being shared grows exponentially.

There is potential for things to go wrong, which I will write about in the future. Usually, you should avoid sharing dangerous information: how to build explosives, break into computer systems, or do other harmful things.

For all other things, please, do share. Teach, learn. Because the effects accumulate, our future can be exponentially better – or worse – depending on this.

Want to learn programming?

Derek Sivers has been a great inspiration for me in many aspects. My now page was basically his idea – there’s a whole movement around what people are focused on doing at the moment, and the sort-of community feeling keeps me accountable.

Some time ago, Derek wrote about how nice it would be to just have someone tell you what to do, and the fruit of that line of thought is his “Do This. Directives – part 1” article, which appears to be the first of a series.

I don’t yet have directives; not hard and fast ones. I used to have a page up with things you should read and work through, which would level you up. It’s more in the style of How To Become a Hacker.

I’ve not taught many people to code, so I don’t have a crystallized view of what, exactly, would get you to become a competent programmer. While that comes along – I intend to teach many people to code in the near future – here’s a rendition of my advice on becoming a competent programmer:

  • Work through Learn Python the Hard Way. Internalize the mechanism for learning.
  • Read How to Become a Hacker and follow its advice.
  • Work through Nand2Tetris.
  • Work as much as you can through seminal works, like The Art of Computer Programming and Structure and Interpretation of Computer Programs.
  • Join user groups in your vicinity.
  • Get something done – something nice for yourself. At this point, you should have already.
  • When you make mistakes: identify them. Catalogue them. Learn to avoid the whole category of mistake, if possible. Share it, so that others may learn to avoid it.
  • Stop saying you’ll do it. Stop wondering whether this has something to do with what you want to do. Stop making to-do lists. Start. Now. Hurry.

See you on the other side 🙂

Innovation conundrum

Coordinating people is hard; I think it’s the next step after giving away the information that allows for wealth creation in particular niches.

Even if you have a small group of enthusiasts who are discovering how to clobber a particular set of problems, once a subset of those problems has been trivialized, hardships ensue: who’s going to do the now-boring work? And who gets to take the now easily grasped fruits of that work?

In the digital/computing realm, there’s still enough that needs to get done that this is no big issue. In the physical realm, the problem is further complicated by lack of access to the raw materials that enable execution.

Being able to work out deals or mechanisms that allow people to feel content doing their part, enabling the exploitation of “already solved” areas without hindering the discovery of new solutions, is very valuable. Companies with good R&D departments that help the business thrive without suffocating innovation manage to do it.

In order to be able to share wealth properly, we need to find good solutions to this innovation conundrum, which can be summed up in two points:

  • solutions make whole areas boring and exciting at the same time (for different kinds of people)
  • tensions arise in resource allocation, both between parties that want to exploit the newly discovered improvements and between the “exploiting” and “innovating” camps.

Avoiding Arguments

Sometimes arguments are not crucial to your ends.

As a means to get people to understand you, correct your ideas, help you shape the lens through which you see the world, arguments are amazing. But sometimes you just need to get something done.

Sometimes, you’re committed to a particular opinion, and are certain enough that you’re right that you don’t want to waste time arguing.

I put a high price on certainty – the more certain you are, the more you should be willing to bet, be it money, comfort, or the possibility of winding up working twice as much if you’re wrong. If you’re really certain, sometimes you just have to put your money where your mouth is, and commit. Offer to carry the burden.

“I’ll have our backs if something goes wrong”

“I’ll be responsible for this, if we do it this way”

“If we go down this path, it will be so much easier; I’m willing to take a bigger chunk of the work”

At other times, this won’t work – mostly because someone else is equally invested in a way of doing things that is incompatible with yours. Offer the other person the chance to take responsibility; put them on the spot. If they don’t step up – well, I sure hope you’re right, because things will most likely be done your way.

Retrospective: Clean Code – Boy Scouts, Writers, and Mythical Creatures

Yesterday I gave a talk on Clean Code, based on content by Uncle Bob and Geoffrey Gerriets.

I had some technical issues – so I had no access to my presenter notes, dampening my performance somewhat… and that after I’d taken such pains to learn from Geoffrey’s talk at PyCaribbean on Code Review, Revision and Technical Debt.

The subtitle for my talk was: “Clean Code, Boy Scouts, Writers, and Mythical Creatures”.

It starts out by talking about the features of clean code, as described by Uncle Bob and the people he interviews in his book Clean Code – comparing each group of traits to physical things we can look up to, like a Tesla Model X, a pleasant beach, or a bike… all of which share traits desirable in our code.

It then goes on to the maxim of “leaving the campground better than we found it”, with a nice example of some code taken from the IOCCC and how much more legible it became merely by reindenting it, putting into relief the long-term impact of little incremental changes.

The latter half of the talk was derived from lessons learned at Geoffrey’s talk: the process of a professional writer compared to the process of a professional coder, and how they’re alike; the lessons from the writers’ day-to-day can be applied to our coding: design, write, revise, rewrite, proofread. Some attention was given to the way reviews may be delivered. The mythical creatures section – the creatures represent the different stages at which a developer may find themselves – aids this latter part of the talk by pointing out patterns of behavior that reveal what may or may not be important to a given developer at a given point in their growth. The advice to treat things that are beneath a developer’s level as trivia and/or minutiae, as well as the advice on focusing on and choosing improvements to point out instead of “trouble”, may be the best of this part of the talk.

After realizing I’d burnt through the presentation and posing some questions to the audience, we discussed some interesting points:

  • How can code comments make code cleaner or dirtier?
  • How can rewrites alter our coding behavior?
  • How can we find time to have a re-writing flow if the management doesn’t know any better?

Mileage may vary, of course; several people pitched in, and rather than drawing firm conclusions we presented ideas to try, which was interesting.

In the end, we came out with some good ideas on how to keep code from stagnating… hopefully our future selves will have ever fewer messes to deal with :^)

Retrospective: Reasons why you should love emacs

Last Saturday I was at Dominican College O&M’s campus at La Romana, as one of the speakers for the “OpenSaturday” series of events.

This was a complete success: a full room, an engaged audience, excellent speakers.

I had the opportunity to give my first talk on emacs.

It was delivered using org-tree-slide-mode for one part — which was really cool for the audience and for me, too.

On the second half of the presentation, I used org-mode and demonstrated custom “ToDo” states and timestamps, org-mode html export (C-c C-e h H), syntax highlighting, Immediate mode, and Emmet mode. Of course, I demonstrated having multiple buffers open in several windows at the same time.

It was all in a hurry, because it was a 10-minute talk; I couldn’t demonstrate keyboard macros – which would’ve been nice, as I was going to demonstrate extracting HTML form field names and generating the PHP code to read them from the $_REQUEST superglobal. This makes use of Emacs’s ability to use search and any other function as part of a macro, which I know for a fact several editors can’t do.

The show-stealer was Emmet mode – I actually thought people would be more surprised at noticing that the presentation was running inside emacs, but they weren’t. As many of them are CS students learning HTML, seeing html>(head>title)+body>div#content>ul>li*5 grow into the corresponding tree blew them away.

I’m planning to enhance that presentation to fill a 45-minute slot featuring keyboard macros, elisp functions + keybindings, and select parts of my .emacs file. Perhaps the presentation will be accompanied by a “Reasons to love Vi” by one of my colleagues, which would be sweet.

In any case, a great Saturday – hopefully things will keep on being fun.

Notifications

I usually don’t have internet on my phone unless I’m home.

I started playing a recent freemium game aiming to be an e-sport.

The game has offline notifications.

This eroded my capacity to concentrate properly, little by little. I can now appreciate why many decry this as the era of distraction, of people looking down at their phones all the time. Not to misrepresent my stance: I’d noticed people walking around carelessly, too focused on their phones; it hadn’t really worried me, since once upon a time I did the same, only carrying books around instead of a phone.

What I hadn’t noticed was the barrage of notifications at seemingly random times; the concatenation of stimuli at near-random intervals that creates a tic-like habit of constantly checking the phone, fomenting a mental nag.

Any kind of worthwhile work – which is any work not to be automated just yet – takes time and focus; it probably needs introspection and analysis. Otherwise, it should be automated as much as possible until we find the need to analyze and meditate anew.

In any case, I’ve since uninstalled the application, which had no option to deactivate the offline notifications. I’ve also uninstalled a few other applications, which I liked to use from comfortable places at my home. And I took some days to cleanse.

Now that I’m not twitchy with check-my-phone-itis, I feel a lot better, and will keep posting – because now I can think through stuff, and remember better.

In case there’s any doubt, I’m saying that constant nagging severely impaired my focus, productivity and overall quality of life. You may be affected, too. Run an experiment on living without constant notifications running you, and see what happens. As for me… well, it seems not to be my style at all.

Hyperbolic Discounting

There are many ways to observe and explain the behaviors I described in the previous article. Hyperbolic discounting is a way to think about the issue.

On the one hand, we have the notion that people tend to push things they don’t like further into the future, which is basically what thinking about “low time preference” gets you.

On the other, thinking about hyperbolic discounting lets you analyze the tendency to choose what we’d like now, even in exchange for things we’d not want later. In other words, we tend to trade “good times now” for “bad times later”, such as partying hard in exchange for feeling beat, cranky and in pain tomorrow.

This may offer useful insights when trying to calibrate time preference. Perhaps by reframing the future situation, or doing a mental exercise that lets us feel as if the bad times were going to happen before the good times, we’d be able to balance our decision-making tendencies.

There are some good reasons for hyperbolic discounting – such as increasingly lower certainty about outcomes when the chains of events stretch over very long periods of time.

In any case, today I ran a tiny experiment to improve my productivity without varying my time preference – I killed all the distractions and sat down to work. It worked for most of my work session, but I was filled with anxiety that important things might be happening and I’d not find out, because I was disconnected. I’ll keep the experiment running for a few more days and see what happens; I expect I’ll get used to the pattern and stop feeling anxious.

I still got more done than usual, which is nice. I hope the pattern continues.