
Saturday, June 29, 2013

Mankees - update_deps and init plugins

Image: Monkeys by Trey Ratcliff (CC BY-NC-SA)
If you follow this blog, you might remember my recent post about mankees, a small scripting platform I developed. As it happens, I've done some additional development on it since. The terminal client is more or less the same; the real innovation has happened on the plugin side.

In order to organize my work better, I defined an organization at GitHub. It contains all mankees development and acts as a hub of sorts. That's way better than going through my massive GitHub profile, which really starts to break down as you accumulate lots of small projects, making it difficult to find the interesting stuff.

Basic Idea - Recap

But I digress. Back to the topic. As I was saying, I've done some development on the plugin side. In order to make the plugins simpler, they are now plain Node.js modules. Effectively this means all you need to do is set up an index.js whose module.exports contains the entry point (a function) of your plugin. The template highlights this idea well.

To make it possible for mankees to discover your plugin, you need to have it in your ~/.mankees directory. After that you should be able to see it by invoking mankees -l. To invoke it, just pass its name to mankees like this: mankees template 1 2. You may access those parameters at the plugin entry point.
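To illustrate, here is what a minimal plugin might look like. This is just a sketch of my own; the plugin name (shout) is made up, and the exact signature is best checked from the template:

// ~/.mankees/shout/index.js
module.exports = function() {
    // mankees passes the CLI parameters to the entry point,
    // e.g. `mankees shout hello world` should yield ['hello', 'world']
    var params = Array.prototype.slice.call(arguments);

    console.log(params.join(' ').toUpperCase() + '!');
};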

update_deps plugin

I don't like to use loose semver ranges when developing services. It is way too easy for things to get broken as some Node.js packages get updated. I know that in principle that shouldn't happen if everyone is doing the right thing, but better safe than sorry. That's why I like to lock down versions in a strict way (i.e. 0.2.1).

update_deps plugin supports this workflow. It simply fetches the current versions of the project dependencies and then writes those to package.json. After that it's up to me to continue. This is something that's quite painful to do manually, and the plugin allows me to do the same thing literally in seconds. Not bad!
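In spirit, the plugin does something like the sketch below. This is not the actual implementation, just an illustration of my own that shells out to npm and pins each dependency exactly (note that execSync requires a reasonably recent Node):

var fs = require('fs');
var execSync = require('child_process').execSync;

var pkg = JSON.parse(fs.readFileSync('package.json', 'utf8'));

Object.keys(pkg.dependencies || {}).forEach(function(name) {
    // ask the npm registry for the latest published version
    var version = execSync('npm view ' + name + ' version').toString().trim();

    // pin it strictly, without ^ or ~
    pkg.dependencies[name] = version;
});

fs.writeFileSync('package.json', JSON.stringify(pkg, null, 2) + '\n');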

init plugin

Starting a project involves setting up a lot of boilerplate. You have to do things like write a README.md, set up a package.json, set up a LICENSE and perhaps something else. It's always a chore to do this.

init plugin solves this problem for me. It uses some predefined configuration that contains things like my name and then injects that data into a set of templates. Based on this simple operation I get a fresh project to start working on.

It is simple to write new project presets. A Node.js oriented one is included in the plugin. The system relies on Handlebars, and the data to be injected is defined in ~/.mankees/config.json.
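As a rough illustration of the idea (the template file name below is hypothetical):

var fs = require('fs');
var handlebars = require('handlebars');

// data such as the author name lives in ~/.mankees/config.json
var config = JSON.parse(fs.readFileSync(
    process.env.HOME + '/.mankees/config.json', 'utf8'));

// each preset template, say package.json.hbs, gets the data injected
var template = handlebars.compile(fs.readFileSync('package.json.hbs', 'utf8'));

fs.writeFileSync('package.json', template(config));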

This is another good example of a plugin that simplifies life somewhat and makes it faster to author packages. You still have to work hard on the logic but that's just inevitable.

If you want to try out something way more hardcore, consider giving Yeoman a go.

Conclusion

I think it's safe to say that developing this scrappy little plugin platform has been a personal success. I keep finding myself using it. As a result I get some new ideas on how to improve my workflow. This is truly the best thing about being a developer. You can eliminate at least some of those boring parts and focus on the more interesting ones quite easily.

Wednesday, June 26, 2013

Plumbing in Node.js or How to Get Started with Streams

According to an article with the interesting title "Are coders worth it?", web developers are just sophisticated plumbers. That's a fitting analogy for the topic of this post, namely streams. Streams are a concept that originated in the early days of the Unix operating system. If you open up a terminal, you can grasp the power of streams.

Sources, Filters, Sinks

Image: Pipe by darwin Bell (CC BY)
The idea is very simple. You have sources, filters and sinks. Sometimes people might use a different naming scheme but this is enough for my purposes. Consider the definitions below:

  • A source yields data. This could be text, video, audio or whatever you can think of.
  • A filter accepts data, transforms it and passes it further as an output.
  • A sink accepts input and converts it into some concrete output.

Using these three concepts you may build chains that perform a series of transformations on data and output the result. Generalized further, you may define the transformation as a graph. As a result you end up with a dataflow architecture. If you have ever used a graph editor, say for video or audio, you know what I'm talking about.

The power of this type of scheme lies in the fact that even though the individual parts may be quite simple you can achieve powerful results by combining them.
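The same idea maps directly to Node.js. Here is a tiny uppercasing filter of my own, sketched with the streams2 Transform class:

var Transform = require('stream').Transform;

// a filter: accepts data, transforms it and passes it on
var upcase = new Transform();
upcase._transform = function(chunk, encoding, done) {
    this.push(chunk.toString().toUpperCase());
    done();
};

// source -> filter -> sink
process.stdin.pipe(upcase).pipe(process.stdout);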

Examples

I will give you a couple of simple examples based on my usage next:

  • pbcopy < file.txt - Copies the contents of file.txt to OS X clipboard
  • pbpaste > some_file.txt - Pastes the contents of clipboard to some_file.txt
  • james | ./notify.js - Pipes output of "james", a build tool, to a notification script that displays the output on OS X notification center
  • james > /dev/null - I could not care less about the output. Let's allow null to chomp it.
There are more tricks of this type available in this brief Unix streaming tutorial. I've included the source of ./notify.js below:
#!/usr/bin/env node
var notifier = require('node-notifier');

// open stdin so the script can read piped input
var stdin = process.openStdin();
stdin.setEncoding('utf8');

// forward each chunk of input to the notification center
stdin.on('data', function(chunk) {
    notifier.notify({message: chunk});
});
It is a simple script that uses node-notifier to display the streamed input as notifications. node-notifier builds on terminal-notifier. In fact, if terminal-notifier had been designed right, it would accept the stream straight away. It is a good idea to follow the standards when writing terminal tools, as it allows other people to build upon your work more easily.

Conclusion

I hope this article gave you some idea of the power of streams. In case you wish to use them in your Node.js development, check out Five Minute Guide to Streams2. It explains the Node.js API in further detail. You'll find some familiar concepts there. substack has written a fine guide on the topic as well that goes even deeper.

A while ago we had a local event in which we discussed Node.js streams. As a result some interesting material was born. Check out node-nyanstream and streams-dojo in particular. Big thanks to Esa-Matti Suuronen for compiling those.

If you like adventures, you might want to check out substack's stream-adventure. It comes with a series of small tasks you ought to solve.

I am very interested to hear how you use streams in practice. Are there specific tasks for which they are particularly suitable?

Tuesday, June 25, 2013

Mankees - a Scripting Platform for Node.js

Isn't it annoying to do certain boilerplate tasks, such as bumping a version number? As programmers it's our duty to eliminate boring tasks, as that allows us to focus on the more interesting ones. Traditionally this sort of thing has been sorted out by writing scripts (bash, etc.).

Unfortunately I'm not that great at bash scripting, and I cannot even begin to comprehend how to do something simple like modifying package.json with it. It's way easier for me to do these sorts of manipulations using Node.js, as there's native JSON support.
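For instance, bumping a patch version takes only a few lines of Node.js. A minimal sketch:

var fs = require('fs');

var pkg = JSON.parse(fs.readFileSync('package.json', 'utf8'));

// bump the patch part of an x.y.z version
var parts = pkg.version.split('.');
parts[2] = parseInt(parts[2], 10) + 1;
pkg.version = parts.join('.');

fs.writeFileSync('package.json', JSON.stringify(pkg, null, 2) + '\n');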

libtemplate.js - Library Template

I solved that version number problem earlier at libtemplate.js, a template library of sorts. It's a library on top of which you can write a library of your own. How meta.

It didn't feel right to include the script in the project. On the other hand I didn't want to pollute my bash namespace by putting yet another script on my path. And besides, I've always wanted to write a little package manager of my own.

mankees - mankee see, mankee do

Image: Crow and crow by antiuser (CC BY-NC-SA)
I went ahead and wrote one. It's known as mankees. Mankee see, mankee do, you know. Each mankee (plugin) performs some specific task. There's a small wiki-based registry. The tool simply acts as a wrapper that makes it easy to install and execute these tasks. That's all.

If you ever wanted to write a package manager, say for frontend JavaScript, having a look at the implementation might give you some idea. On a more serious note please avoid doing that. We have enough of those already.

Conclusion

There aren't too many plugins available yet, just a template and a tagging one. As I come across repetitive tasks I intend to turn them into mankees. If you happen to have specific ideas, let me know. Perhaps we can shape the tool into something more than a curiosity.

Thursday, June 20, 2013

How to Upload Canvas Data to Server?

In a recent assignment of mine I had to deal with HTML5 Canvas. I built a simple editor. The problem was that I needed some nice way to upload the data to the server. As it wasn't as easy a problem as anticipated, I thought I'd write this post. Perhaps someone else struggling with the same things will find it useful. :)

The simplest way to upload Canvas data to the server is to encode it using toDataURL and then pass it as a part of your query string. After that you decode it again on the server and store it somewhere. This approach might be alright for small images, but I do not recommend it for bigger ones.

You are better off using FormData and Blobs. These are quite recent additions but seem to work well in modern browsers. The biggest advantage of this approach is that it allows you to treat your Canvas data as if it were a regular file. You also avoid having to decode the data on the server side, so that's a nice bonus. And it is possible to implement progress bars and all that jazz on top of this.
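Put together, an upload might look roughly like this. The element id and the /upload endpoint below are made up:

var canvas = document.getElementById('editor');

// toBlob is asynchronous and hands us the canvas contents as a Blob
canvas.toBlob(function(blob) {
    var formData = new FormData();
    formData.append('image', blob, 'drawing.png');

    var xhr = new XMLHttpRequest();
    xhr.open('POST', '/upload');
    xhr.send(formData);
});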

Unfortunately a certain method we need to make this work, namely Canvas.toBlob, isn't that widely supported yet. Especially the lack of support in Chrome is problematic, although Firefox seems to work just fine.

As it happens, there's a way to work around this issue. The fix might not be that nice due to the amount of processing required. Still, it's way better than nothing. We simply need a custom function to convert a data URI to a Blob. I came across a solution at Stack Overflow. It required a bit of tweaking, as BlobBuilder was eliminated from the specification in favor of Blob. I have the fixed version below:
function dataURItoBlob(dataURI) {
    // convert base64 to raw binary data held in a string
    // doesn't handle URLEncoded DataURIs - see SO answer #6850276 for code that does this
    var byteString = atob(dataURI.split(',')[1]);

    // separate out the mime component
    var mimeString = dataURI.split(',')[0].split(':')[1].split(';')[0];

    // write the bytes of the string to an ArrayBuffer
    var ab = new ArrayBuffer(byteString.length);
    var dw = new DataView(ab);
    for(var i = 0; i < byteString.length; i++) {
        dw.setUint8(i, byteString.charCodeAt(i));
    }

    // write the ArrayBuffer to a blob, and you're done
    return new Blob([ab], {type: mimeString});
}
EDIT: Updated to use DataView.
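With the fallback in place, you can feed the result to FormData just like a native Blob. Assuming a canvas variable as in the earlier sketch:

var blob = dataURItoBlob(canvas.toDataURL('image/png'));

var formData = new FormData();
formData.append('image', blob, 'drawing.png');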

Note that there's a full, working example available now! It relies on a Node.js backend, but you can replace it with something else if you need to.

Tuesday, June 18, 2013

CDNperf - CDN Performance in Numbers

The big day has come. It's time to reveal CDNperf. The service has been designed to provide a quick glance at various JavaScript CDN providers. Before this, it was very difficult to get an idea of their performance. Not anymore.

Besides providing this sort of service, we aim to extend CDNperf to contain web performance related content. It is one of those topics where the information has been scattered around the web. Perhaps we can do something about that.

We have had the site in beta for a while now. I am not saying it's entirely complete yet, but it should be okay enough for casual usage, or a glance at least. In case you are interested in the technology used, have a look at the GitHub repository.

This is a project that simply would not exist without our partners. It started as a collaboration between us JSter guys and Dmitriy Akulov of jsDelivr. We simply wanted to create something together, and this led to the idea of a CDN performance site.

As measuring performance can be difficult and requires global infrastructure, we contacted Pingdom. Thankfully they agreed to sponsor our efforts. To make it all work we also needed a host. BlueVM answered the call. Big thank you!

Even though CDNperf might not be technically the most complex project, it proved to me that you can get things done with a bit of collaboration. It would have been next to impossible for me or Dmitriy alone to build a service such as this. With a bit of persuasion and hard work we managed to do it.

I hope you enjoy using the service. In case you happen to have any specific ideas on how to make it better, do let us know in the comment section.

Saturday, June 15, 2013

Thoughts on Devaamo Summit 2013 and treweb

Guess what, it was Devaamo Summit time of year again! This time it was a double event, as they organized treweb, a web focused unconference, while at it. I visited the conference for the first time last year and it was great fun. For some reason there were fewer people around this year, but it was still quite nice. It's all about the quality, as you know.

SurviveJS - Redux

I posted a talk proposal to the Summit at the end of April. That was a direct continuation of my previous talk at AgileJkl. The idea was to prepare something shorter and more focused. As I didn't hear from the guys after that, I thought they had ditched the idea. You can bet I was surprised when I discovered my name on the official program yesterday.

As I did not want to disappoint the organizers, I decided to wing it and use my AgileJkl slides in my 30 minute slot. Overall the presentation felt nice from this point of view, and I think I might have gotten a few points through. I made the whole thing more practical by showcasing actual implementations and tools. It felt like a good idea to get the audience involved a little bit.

Somehow the stage was easier this time, as the audience was on the same floor level as the speaker and it was very easy to engage them. Maybe it wasn't that bad a thing not to prepare too much after all. Anyway, it was yet another good learning experience. I think caffeine added some extra buzz to the whole thing.

In case you are interested in the associated material, check out the companion site of the presentation at Survive JS. It delves way deeper into the topics discussed and should provide some food for thought for the technically minded.

Thoughts on Presentations

Overall the presentations I saw were of at least adequate quality. Yes, some were perhaps more boring than others, but that's just fine. At least the short slots more or less force you to get to the point eventually. I know mine would have been more effective had I trimmed it. If I get to present the ideas again, I just might do that.

Jolla - Keynote

Marc Dillon, the former CEO of Jolla, gave the keynote of the conference. As you might already know, Jolla is Finnish for dinghy, and the company spun off from Nokia. Quite apt naming, eh? They're building on the legacy of MeeGo and aim to shake the mobile world in their own way. Who says lightning doesn't strike twice in the same spot?

Android and Fragmentation

Marc made some interesting points about how Android is becoming more homogeneous (less variance between vendors) and how walls are being erected between Google and the developers. I am actually not sure whether making the Android ecosystem more homogeneous is a bad thing on the whole. At least in my mind, the current state of fragmentation is something Google needs to take seriously, as it impedes the value of the ecosystem.

Android is a very difficult target because of this. There are simply so many possible configurations around that it is difficult to test everything. Contrast this with something simple like Apple's iOS and you understand why it is such an issue.

The Other Half

I was initially somewhat skeptical about the concept of the "other half" introduced by Jolla. Marc alleviated my concerns a little bit. Now I understand that it's more than just a gimmick such as Nokia's Xpress-On covers (as on the N79), which could change the system theme based on the cover inserted. Rather, it's a hardware extension point for the whole phone.

Imagine if you could attach some new sensors or perhaps a QWERTY keyboard to your phone. The concept seems to allow this. I know these sorts of things are still very much vaporware. Still, you have to admit there's something to the idea. We shall see whether it's enough.

Sailfish OS

Marc didn't go into the specifics of Sailfish, the operating system they are developing; it was only glanced at on a high level. Basically he stated it is more open than Android, and provided that the business starts to grow, they'll open it up more. He did make the point that it is essential for a company to retain at least part of the secret sauce to itself, even if it deals with open source.

I think this might be something that's more critical for companies such as Jolla. They cannot afford to allow some other company to replicate their whole business. My guess is that the secret bit is there just to counter this scenario.

The Future of Jolla

It seems like Jolla tries to position itself somewhere between hardware and software. If I understood correctly, they want to be more like a community shepherd and enable growth on top of their system. Eventually this should lead to them working as a licensor so that bigger companies, such as Samsung, could build on top of their technology.

It remains to be seen whether they can reach the critical mass needed for this type of strategy to work. Especially the Samsung case seems intriguing. Why would Samsung license Sailfish rather than focus on some of their own initiatives? What value could Sailfish possibly provide that Samsung could not otherwise reach? I hope the Jolla guys have a good idea.

Enabling Open Data Communities

Jarkko Moilanen of Avoindata.net discussed how to enable open data related communities to emerge. We're still pretty new to the concept of open data here in Finland and are just starting to really open up our data reserves to the public. Jarkko discussed some of the challenges involved.

The Need for Vocabulary

Interestingly one of the main challenges seems to be finding the right Finnish vocabulary to discuss the concepts related to the topic. This sort of work is required so that things may be communicated effectively in the right places for progress to happen.

The Value of Open Data

Open data provides clear value to commerce. Unfortunately opening data is not that simple. Besides money, it takes a certain amount of political will and resolve. Avoindata.net, the site developed by Jarkko, aims to ease this process. It seems to have gained some traction already.

Monthly Meetups

Besides maintaining that Stack Overflowish site, they've arranged monthly meetups for around half a year at the time of writing. The meetups have been a great success. Interestingly, they have led to more data becoming available. Deadlines motivate even government people, it seems.

The meetups foster interchange and enable new ideas to emerge. The format seems quite simple: just a couple of short presentations and some working together. Unfortunately it seems the time available isn't enough to reach sufficiently concrete results. Currently they are thinking of arranging a hackathon type of event to remedy this.

Other Thoughts

There seems to be some seriously good buzz in the area of open data around Tampere. They understand the value it provides for the local business and are willing to work for it. I can only hope the local powers that be at Jyväskylä understand the same thing very soon as well. At least there are some signs of that already.

HSL Navigator

Tuukka Hastrup gave a talk on HSL Navigator, a web based navigation concept for bus passengers. We have navigators for cars already, but what if we had one for these guys as well?

If I've understood correctly, HSL takes care of the public transportation around the metropolitan area (Helsinki, whatnot). They've made some great progress in opening data. In 2009 around 30 apps had been built on top of it. In 2012 the figure was 670. That's a massive increase and shows the power of developers.

Tuukka showed a visualization I found very interesting: a journey time map. You simply provide your current location and it shows on the map how long it takes to travel from that spot to nearby areas. There's some sophisticated math behind that.

I'll get back to this topic at my summary of the last talk as it was related to this one.

Traffic API Developer Sandbox

Tero Piirainen of ITS Factory discussed what they've been doing with traffic data over at the Tampere region. One of their primary objectives seems to be to generate business by making traffic data available.

There are many sources, including road traffic, geodata and public transport, to mention some. Currently the data that is available is often in non-standard, proprietary formats. They aim to change this and want to use open, standard formats instead.

As I said earlier, they seem to have some good buzz going on at Tampere and this presentation confirms it. They've acknowledged the value and seem to be willing to work for it.

Firefox OS

Thomas Rücker of Tieto gave a talk on Firefox OS. To be more accurate, it was more of a talk of talks, given that multiple slide sets were used. All I knew about Firefox OS before this talk was that it was web based. The talk clarified some details.

The phones themselves are very affordable, somewhere around 100 euros. Besides buying a new phone, you might get the system to run on your current Android device, although apparently there may be some issues with resolutions. But there are supposedly hacks for that.

The nice thing about the platform, depending on your viewpoint, is that you get to do everything in JavaScript. There are hardware level APIs available natively. As you might not want to give everyone on the web access to your phone, there is some kind of an elevation based security model included as well. You can decide not to allow that nice app to send SMS messages unless you really want to.

To be honest, out of Sailfish and Firefox OS, the latter seems more promising to me at the moment. At least they've gotten out of the vaporware stage and have something concrete to show. I would not mind if it became a great success. Even as a minor success it might force some of the bigger players to rethink their approach and make the web a first class citizen on their platforms.

The next sections will be shorter, as they were part of treweb and more technical in nature. I've tried to pick out the juiciest bits rather than bore you with the details.

Typography ABC

Antti Mattila talked about the basics of typography. I won't go into details as you can pick up the basics at a resource such as Hack Design. I will mention a couple of core points:
  • Learn to pair typefaces
  • Don't use a typeface that is too small. Consider poor eyesight and all that
  • Most of Google Fonts aren't that good. Use Beautiful Web Type to find the good ones
  • Font Squirrel has a nice set of free fonts available
  • Of commercial ones Typekit and Fontdeck should be nice
  • Fonts are expensive for a reason. You get what you pay for

CSS Flexbox

Janne Kallunki discussed CSS flexbox. It's one of those features that would make the daily drudgery with CSS so much nicer. It provides some well needed functionality and makes some difficult things almost intuitive. Rather than having to fight with floats and display options, your declarations become more natural.

The biggest issue with the feature seems to be poor support. There are multiple versions of the specification. This has led to some major fragmentation between browsers. And then we have IE. I won't go into that.

It looks like it's still too early to use flexbox in production. The feature shows a great deal of promise. I think it will take a couple of years till we see it in the mainstream. Even then it will likely come with some kludges.

This, That and the Self

Antti Järvinen talked about this. I mean that. Okay, bad pun. Anyway, the talk was interesting and highlighted various corner cases. Personally I like to avoid this and overly object oriented code these days, but perhaps that's just me. You still have to be aware of the basic binding rules.

The biggest gotchas seem to be related to closures, as this doesn't work as intuitively in them as you might expect. I remember tripping on this when I got started with JavaScript. Remember, if you use a closure and want to use this, you will likely want to use that instead, as shown below. Oh, and be careful with jQuery and this. It does some strange things to it.
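A minimal sketch of the classic workaround:

function Counter() {
    this.count = 0;

    // capture the outer this so the closure can reach it
    var that = this;

    setInterval(function() {
        // inside the callback, this would not point at the Counter
        that.count++;
    }, 1000);
}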

Web Frameworks on Cloud Platforms - Pitfalls and Improvements

Jukka Tupamäki's presentation was probably the most academic one in the track. I won't go into much detail here. I expect you know cloud is the shiz, it's elastic and everything. Even though cloud is the new hip thing, frameworks haven't quite kept up with the progress. As a result there is all sorts of nastiness that developers have to deal with.

When you start to scale and share data and logic between servers, traditional approaches fall apart. Your cronjobs might start running multiple times and wreaking havoc on your database. Data might get stored on the wrong server. You get the idea.

This leads to macro scale changes. You should take the cloud and scalability into account in your application design. If the architecture doesn't scale to multiple servers, you will be in big trouble at some point.

I think that in practice this means that you should aim to encapsulate your service into smaller compartments with well defined interfaces. This is perhaps starting to sound a bit like SOA could be a good idea.

Offline Transit Routing in JavaScript

Juha Järvi of Busfaster continued where Tuukka left off. Offline usage is particularly important for user groups such as tourists. Nobody wants to store a terabyte of map data on their phone, though, so how do you get something that's useful enough? That's where various compression techniques come in.

First of all, the whole map has been sliced into smaller areas. Within each area nearby points may be merged without sacrificing too much accuracy. Dead ends may be eliminated. There are these sorts of small things you can do to drastically reduce the data while still retaining enough that valid results may be attained.

Even timetables may be compressed, as they contain regular data. You simply need to store some sort of an offset rather than each individual time. Better yet, the data doesn't have to be exactly correct. This provides room for further optimizations, as you can afford to let the data be a minute or two off, for instance.
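A toy sketch of my own of the offset idea, encoding departure times as deltas:

// departure times in minutes since midnight
function compress(times) {
    var deltas = [];

    for (var i = 1; i < times.length; i++) {
        deltas.push(times[i] - times[i - 1]);
    }

    // a regular timetable yields small, repeating deltas,
    // which compress far better than absolute times
    return {first: times[0], deltas: deltas};
}

compress([360, 370, 380, 395, 410]);
// -> {first: 360, deltas: [10, 10, 15, 15]}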

One interesting scenario he mentioned is that the data could be served from some cloud service the user has access to. Dropbox could be a good alternative, for instance. If they can make the workflow simple enough, I guess even the less nerdy of us would find this useful.

The technology stack they use is quite conventional: basically just Closure Compiler, Node.js, PostGIS, Emscripten and asm.js (an Emscripten target). I thought everyone used Uglify these days, so Closure Compiler was a bit of a surprise. I guess it must have its benefits, especially in a case such as this.

Conclusion

It was very nice to visit Devaamo Summit again. treweb was just a nice bonus. The venue is just great for something like this. I hope they'll have more resources available next year so they can make the event even more awesome. It's just a matter of pulling the right strings.

I had actually prepared some bonus material for treweb that would have included some live coding. Perhaps it's better to save that for some other time. :)

Friday, June 14, 2013

Node.js - a Collection of Server Configuration Patterns

There are many ways to set up Node.js server configuration. Rather than writing a blog post about the topic and having to maintain it, I started a repository that goes through a couple of patterns I have used. During its development I found myself writing yet another module, namely parse-env.

parse-env allows you to support environment variables on top of your current configuration scheme, as long as it is based on an Object structure. It simply constructs environment variable names using some simple logic and then checks whether they exist or not. This allows you to rely on convention rather than having to define each of those by hand.
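The idea, in a rough sketch of my own (this is not the actual parse-env API):

// derive environment variable names from a configuration object,
// e.g. {db: {port: 5432}} checks DB_PORT, and let the environment win
function applyEnv(config, prefix) {
    Object.keys(config).forEach(function(key) {
        var name = (prefix ? prefix + '_' : '') + key.toUpperCase();

        if (typeof config[key] === 'object' && config[key] !== null) {
            applyEnv(config[key], name);
        } else if (process.env[name] !== undefined) {
            config[key] = process.env[name];
        }
    });

    return config;
}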

This sort of approach is especially valuable in environments, such as Heroku, where you have a limited amount of control over the server. In the case of Heroku you are pretty much forced to deal with environment variable based configuration, as nobody in their right mind would commit configuration to the repository.

Let me know if I'm missing some vital configuration pattern and I'll look into adding it. Even better, poke me with a pull request and I'll be really happy. :)

Friday, June 7, 2013

Linkdump 14 - Business, Startups, Software Development...

To save some effort and make the dump more useful, I'll compile one now. I did one a month ago and have accumulated some more or less interesting content since. Enjoy the findings below!

Business


Startups


Personal Development

Software Development


Web Development


Tuesday, June 4, 2013

Watercolor - a Challenging Yet Rewarding Medium

I have a dark secret. I actually like to draw and paint. Those are not very programmery things to do. But there, I said it. You have to find your fun somewhere. :)

Painting is something I picked up just recently. Before that I'd spent some time doodling every once in a while. I've also spent some time studying constructive anatomy (Bridgman) and have some idea of how to compose pictures. I suppose this is something that might help in web design.

During the last winter and spring I enrolled in a painting course. There was some charcoal work involved. Charcoal is interesting as it's a medium that allows you to both draw and paint. It is a very flexible yet demanding medium. And a very affordable one. Anyhow, it's good preparation for real painting.

The last time I had touched watercolors before that was in the last millennium. I recall the results were not that great, but I think it might have had something to do with the quality of the equipment. You don't need that much equipment, but be sure not to skimp there too much. You can get a decent kit for around 50 dollars/euros with some starting papers and all.

Watercolor as a Medium

My toolkit
Watercolor is a challenging yet affordable medium. You can get started with two good brushes (small and big), a set of colors (a cheap Yarka one will do), a sponge (borrow a makeup sponge) and a water container (you can cut a milk carton).

Brushes come in many flavors. It's a good idea to spend a bit on quality to make the process less frustrating. The biggest one in the picture is actually a cheap but interesting brush known as a hake. It was popularized by Ron Ranson. I've yet to master it.

Besides these basic tools you'll need some paper and tape. Be sure to pick heavy enough paper (at least 180 gsm). That's another way to avoid frustration, as it allows you to wipe some of the color off using a sponge and overall allows you to be rougher while working. Heavier papers also enable techniques such as "wet on wet".

If you want to end up with an even result, you might want to consider using a specific kind of watercolor tape rather than a regular one. It takes some extra effort to set up, but it's worth it, especially if you wish to frame your painting.

Watercolor + ??? = Awesomeness

Pencil study (~5 mins)
What makes watercolor interesting as a medium is the fact that you can combine it with others. You could for instance create an underdrawing using pencils or waterproof markers. There are also specific kinds of pencils which dissolve when touched with water.

You can also work on the painting while it's still wet and draw on it. Stabbing is also allowed (knife is useful!) and may lead to interesting effects. Or you could sprinkle some salt on it. There are no limits.

Painting Process

Before I start to paint, I usually prepare a preliminary drawing in which I examine basic shapes and their relations. Somehow it makes it easier to focus on the difficult parts, like color, while painting. In order to make it easier to deal with color, I like to paint a small color study on a card.
Color study

It's a good idea to paint a few paintings of the same subject in a row. You can even work on them simultaneously while waiting for some parts to dry. Watercolors are not for the impatient.

If you manage to botch some, it's alright. Sometimes accidents can actually be a good thing and add a great deal of interest to your painting. And if it's really horrid, well, that's why rubbish bins were invented. But before scrapping the work, consider painting on the other side first. :)

The painting you can see below was from a series of four I painted within an hour or so. In this case I was in too much of a hurry to spend time on underdrawings. Sometimes it's just better to do than to think too much.

I recall spending around 15 minutes on this piece, using mainly my big, sharp-edged brush. I particularly like how the bottle turned out. Interestingly, each painting had a character of its own even though the subject was the same.

Framed result

Resources

I know reading this post didn't make you an instant master of watercolors. But I hope it at least provided some inspiration. I have a great deal to learn myself. As you learn one thing adequately, you'll find challenges elsewhere.

WetCanvas, a popular art forum, has a nice thread that might be worth looking into. You'll find some beginner and even advanced resources there. The following blogs might provide some ideas as well although their scope is way wider: