What I learned from three years of FLOSS project development

It’s almost a tradition now: like the two previous years (1, 2), I’m sharing my feedback about a year of FLOSS project contributions! Everything began on October 19th, 2010 (I know I’m a little bit late) when I started the Newebe project, a distributed social network. It led me, eighteen months later, to Cozy, a startup that distributes FLOSS personal clouds. This year, my feedback is mainly related to the code we produced at Cozy and focuses on three subjects: development, tech talks and community.


Development

I changed the way I code. I no longer look for the best way to code something but for the most comprehensible one (I make variable and function names explicit, I don’t try to refactor everything…). Fortunately, concise code fits well with that, so my code stays clean.

Automating is boring, but it will make you happy. I wrote more bash aliases for local development and more Fabric scripts (thanks fabtools!) for server management. And you know what? It really makes my life easier! I spend a lot less time on repetitive tasks (this link will give you a framework for finding good aliases to add) and I can focus more on exciting ones.
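To give a concrete idea, here is the kind of thing I mean, a few hypothetical lines you could drop in your ~/.bashrc (illustrative examples, not my actual aliases):

```shell
# Hypothetical ~/.bashrc fragment: short names for repetitive commands.
alias gs='git status -sb'         # compact git status
alias gl='git log --oneline -10'  # recent history at a glance

# A shell function behaves like an alias but accepts arguments:
# create a directory and jump into it in one step.
mkcd() { mkdir -p "$1" && cd "$1"; }
```

Functions are handy when the shortcut needs an argument, which plain aliases cannot take.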

I forced myself to use vim as it should be used. First, I blocked my arrow keys so I could only navigate with h, j, k and l. Then I looked for more shortcuts and changed the bindings I didn’t like. At the beginning it was painful, but today I code much faster and make fewer useless movements. On top of that, with vim it is fun to always be looking for more efficient key combinations.
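If you want to try the same constraint, blocking the arrow keys only takes a few lines in your ~/.vimrc (a minimal sketch; adapt the bindings to your taste):

```vim
" Disable arrow keys outside insert mode to force h/j/k/l navigation.
noremap <Up>    <Nop>
noremap <Down>  <Nop>
noremap <Left>  <Nop>
noremap <Right> <Nop>
```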

I discovered that git workflows based on a lot of branching are not well suited to every project. Maybe they perform well for big projects or projects with many versions, but the Github fork workflow looks good enough for most of the others. I fully embraced that idea when I read chapter 6 of the ZMQ guide: they suggest keeping only a master branch on the main repository while the work in progress stays in the contributors’ forks. This way the main repository branches stay clean (there is only one!) and it makes it easier for newcomers to start contributing.
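Concretely, that workflow can be sketched like this. The snippet below is a local simulation (a bare repository stands in for the main GitHub repo, and all names are examples):

```shell
set -e
work=$(mktemp -d) && cd "$work"

# The "blessed" repository: it will only ever hold a master branch.
git init --bare upstream.git

# The contributor clones it (on GitHub this would be a fork).
git clone upstream.git fork
cd fork
git symbolic-ref HEAD refs/heads/master   # make sure the branch is "master"
git config user.email "dev@example.com"
git config user.name "Dev"
echo "hello" > README
git add README && git commit -m "initial commit"
git push origin master

# Work in progress lives on a branch that stays in the fork...
git checkout -b fix-typo
echo "typo fixed" >> README
git commit -am "fix typo"

# ...and only master is ever pushed to the main repository.
git checkout master
git merge --no-ff fix-typo -m "Merge fix-typo"
git push origin master
git branch -d fix-typo
```

After the merge, listing the remote heads shows a single branch: the main repository stays clean while all the experimentation happened on the contributor’s side.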

Making the code modular really helps collaborative work. You can work on one part of the code without bothering the others. And don’t be scared about module communication: most of the time, module interactions are not as complex as expected.

With my first project, Newebe, I bit off more than I could chew: I decided to fully rewrite the user interface. Because I no longer had much time to spend on this project, my progress was slow, I felt discouraged and the work is still not finished. For a project you can’t work on a lot, planning small steps is better suited.

Working on Libre and Open Source projects really makes me think more about collaboration. I also discovered that simple open tools can be much more effective than expected, and that time management cannot be neglected, even when I code for fun.


Tech Talks

With Cozy, I have had several opportunities to give talks about the project or about related technologies. Moreover, we were part of the Mozilla WebFWD accelerator program and, among many other things, they taught us the art of pitching. So today, I share with you the best techniques I learned regarding tech talks.

You should not put too many new concepts in your talk: it will make it harder to follow. It’s easy to fall into this trap, especially with the stuff you work on every day: it looks very familiar to you, but that is probably not the case for your audience.
Example: when you read a lot of articles about SASS, that doesn’t mean everyone knows what a CSS pre-processor is.

At Mozilla, they made us do an interesting exercise: write the content of your talk based on the audience’s expectations (write down the questions your audience will probably have while listening to you). During a talk you are there to bring something new, but you also have to answer the related questions; the Q&A time is not enough for that.

During most of my talks, I felt that people wanted to ask me questions early on. So, if you have the opportunity to put a Q&A session in the middle of the talk, do it! Talks are often too one-sided; letting the audience speak will make your presentation more dynamic and will bring additional details to your subject.

Speaking louder helps you modulate your voice, and modulation helps you emphasize the most important parts of your talk. The same is true for the flow: when you manage your rhythm well, your speech is easier to understand. Beware of microphones that won’t allow you to speak as loud as you would like.

Talking to an audience is a little bit like playing music in front of one. When you miss something (you forget words or details, or have an embarrassing hesitation), move quickly to the next idea as if nothing happened. On the other hand, always be very well prepared for the introduction and the conclusion.

Before giving talks, I read a lot of articles about how you should move on stage and fill the space. That’s good advice, but most of the time you won’t have much room to move; sometimes you cannot even step away from your desk stand. So don’t count on being able to move a lot.

Recording your talk will help you understand what to improve. Thanks to the LSM videos, I noticed that I make a lot of nervous gestures, which is something I need to work on!

I had limited time to practice my talks, and practicing is very time consuming. So I applied a principle learned from playing music: practice your weak parts the most.

Learned at Mozilla: always put a call to action at the end of your talk. It gives your audience a way to continue the discussion afterwards.

Last but not least: never forget your bottle of water!

[Photo: giving a talk at RMLL / LSM]


Community

With Cozy, we are in an enterprise mindset, so we have to make our community grow quickly. It’s a tough task because we have to please a lot of people while keeping our vision. What I just described is part of the role of the community manager. I don’t like the term management when applied to a community; I prefer to say community facilitator. What I mean by that is that to make a community grow, you have to give it all the keys required to take ownership of the project.

With that in mind, the most important thing I learned is that a project is often too hard to understand at first glance. That is true for contributors and users alike: to welcome newcomers, you have to make every aspect of your project simple.

Two events helped me fully understand that. This summer, I read chapter 6 of the ZMQ guide, which explains that every action required to get started with your project should be frictionless. At the same time, I noticed that request-json, one of our smallest projects, had a lot of external contributions, even more than our other, much more exciting projects. How come? Because the concept behind request-json is simple (an HTTP client for JSON APIs), and the source code is short, easy to read and kept inside a single file. Finally, the documentation is quick to read and accessible directly from the README. In short, anyone can learn everything about the project in ten minutes.

To illustrate, here is what we did at Cozy:

  • Our documentation is more accessible and visible. Even a Github wiki requires too much effort to be found: most people don’t think to look for it. So we put our documentation directly on the project website.
  • Code samples were written in CoffeeScript; we changed them to JavaScript. More people understand JavaScript than CoffeeScript.
  • The website is clearer and it shares our objectives for Cozy (advice from Swarmwise by Rick Falkvinge). It gives people who are interested in the project a good place to start.
  • We simplified our forum: from a multi-section forum, we moved to a single-section Google group.
  • About the code, we changed the base framework of our applications to a lighter one with a clearer file structure.
  • Finally, we allowed people to try Cozy via virtual images before running the installation process.

The results were good: more people posted on our forum, the number of Cozy application downloads doubled to 3000/month, and we received a really big contribution on one of our main modules. This approach works.

I will conclude this part with this advice: a modular project makes contributions easier, because newcomers don’t need to understand all the code to work on your project. Here are two good examples of projects that already do that and work well. Weboob is a tool that allows you to browse web content from the command line from different sources (social networks, youtube, banks…). Any contributor can build his own connector without knowing the core code while taking advantage of the core API. Result: hundreds of modules are available. The other project is ZMQ. They build their community around the language bindings. The main committers are few and work on the core code while the community maintains the bindings; binding maintainers just need to know the core features. Result: there is a ZMQ binding for every popular language.



I’m glad I discovered all these new things, but I have to admit that it would probably have been more efficient to start my FLOSS developer career by contributing to an existing project. I would probably have learned things quicker, in a better context. On the other hand, when I started Newebe, I was driven by the problem I wanted to solve (having an alternative to Facebook) and it gave me a lot of energy.

Voilà, that’s all for this year. I have a lot more to share, but I think there is already enough in this post, so I will stop there. Thank you for reading and see you next year!

How to make a Personal Single Page Application with Cozy

Talk performed at LyonJS, April 2013

Haibu: Personal Platform-as-a-Service

Talk performed at LyonJS, April 2013

How to quickly make REST APIs with CompoundJS

Talk performed at LyonJS, April 2013

Node.js talks at LyonJS

For a few months now, I haven’t had enough time to write on this blog. Things are getting serious with Cozy Cloud, so I’m a little busy: we have a new team of developers and they are doing pretty well. Consequence: I have a lot of code to integrate and some things to teach and show. Another reason is that I do most of my writing on the Cozy Cloud official blog. If you came here for my Node.js articles (this blog’s analytics say most visitors come for that), you could be interested in three of them I wrote on blog.cozycloud.cc:

In addition, if you’re French, I have an announcement ;). I will give several talks at LyonJS this Tuesday. Here are the subjects:

  • How to develop a REST API with CompoundJS
  • Hosting your own application with Haibu
  • How to build a personal single page-application with Cozy in 10 minutes
  • Work organization: agile development, telework, hipster tools and Node.js

I will share the slides soon on this blog via my Slideshare account! I hope to see you there. All the information is available on the LyonJS website.


Upload a file from a NodeJS client to an Express server

I didn’t find any simple resource about how to upload files from a Node.js app to a Node.js Express server. They all give solutions based on the Node.js native API or an old Express API. That’s fine, but using the right libraries will make your life easier.

So here are two little snippets that show you how to easily handle file uploads with request-json and Express. If you want to understand what happens under the hood, read the following documentation and code: request-json, request, form-data, express, body parser, multipart middleware, formidable.

Client side:

# client.coffee
Client = require("request-json").JsonClient
client = new Client "http://localhost:3000/"

extradata = tag: "happy"
client.sendFile 'file-upload/', './test.png', extradata, (err, res, body) ->
    if err then console.log err else console.log 'file uploaded'

Server side:

# server.coffee
express = require 'express'
fs = require 'fs'
app = express()

# File uploading requires express body parser.
app.use express.bodyParser
    keepExtensions: true # optional
    uploadDir: '/my/path/upload/files'

app.post '/file-upload', (req, res) ->
    file = req.files.file

    # Express middleware gives a temporary name to the file, so we rename it.
    fs.rename file.path, "/my/path/upload/files/#{file.name}", (err) ->
        # Stop here on error, otherwise we would answer twice.
        return res.send error: err if err
        console.log "file uploaded. Extradata: tag = #{req.body.tag}"
        res.send success: true

request-json : simple HTTP client for Node.js to deal with JSON APIs

Most web APIs serve data formatted as JSON. Even some databases, like CouchDB, let you manage data through HTTP requests that carry JSON data.

With Node.js, to write an HTTP client that queries these JSON APIs, there is the famous request library. It is able to do a lot of stuff but requires some configuration every time you use it with a JSON API. For instance, if you send a GET request to a URI, you will have to parse the returned body or specify a JSON option. That becomes annoying if you do it frequently.

To avoid that, I wrote a simple library that extends request and will make your life easier when dealing with JSON APIs. It is called request-json and is much more straightforward.

Here are some examples, written in CoffeeScript (the library works with JavaScript too).

request = require 'request'
Client = require('request-json').JsonClient

data = title: 'my title', content: 'my content'

# with request
request.post uri: 'http://localhost:8888/posts', json: data, (error, response, body) ->
    console.log response.statusCode

# with request-json
client = new Client 'http://localhost:8888/'
client.post 'posts/', data, (err, res, body) ->
    console.log res.statusCode

# with request
request.get uri: 'http://localhost:8888/posts', json: true, (error, response, body) ->
    console.log body[0].title

# with request-json
client.get 'posts/', (err, res, body) ->
    console.log body[0].title

request-json brings additional features:

  • it keeps your base path, so you don’t rewrite it every time.
  • it sends and saves files easily (properly done via streams).
  • it can add basic authentication to each of your requests.

NodeJS Cron Job with Kue

At Cozy, we needed to write some cron jobs for our Mails app. For that we used Kue, a simple task manager for Node.js that requires Redis to run. The documentation is great, but there was no sample for writing a cron task. That’s why I share this little snippet, which runs a 2-second-long job every 3 seconds, in case you need something similar.

NB: the delay between two jobs is set in milliseconds.

kue = require 'kue'

Job = kue.Job

jobs = kue.createQueue()

# Set up the server if you want to watch your tasks progress in a nice UI.
kue.app.listen 3003

# Your cron timing, in milliseconds.
delay = 3000

# Function used to schedule a job.
repeatJob = ->
    job = jobs.create "test complete",
        title: "my job"
        info: "job is working"

    # Without save() the job would never be queued; delay() postpones it.
    job.delay(delay).save()

    job.on 'promotion', ->
        console.log job.data.title + " #" + job.id + " promoted"


# The job to run, a fake task that is 2 seconds long.
cronTask = (job, done) ->
    global.currentJob = job.id
    console.log job.data.title + " #" + job.id + " job started"
    setTimeout ->
        console.log "my job is done"
        done()
        # Once this run is finished, schedule the next one.
        repeatJob()
    , 2000

# Register the job processor.
jobs.process "test complete", cronTask

# Check for promotable (delayed) jobs every second.
jobs.promote 1000

# Run the cron job for the first time.
repeatJob()
Whoosh: full-text search with Python

To add an efficient search function to the product I work on, I was looking for a good indexer. Elastic Search, a Java indexer managed through a REST API, looks good, but it requires setting up a dedicated server: it’s not a library but full-blown software. Another option was Xapian, which looks efficient but is not very well documented.

Then I discovered Whoosh, a Python library which offers indexing and search features. The documentation and the API make it really easy to use. The performance is probably worse than Elastic Search or Xapian, but it should be enough for a lot of projects. The library provides many search strategies and functionalities (stemming, faceting, highlighting…). In conclusion, if you have a Python project that requires full-text search, you should definitely have a look at it.

To illustrate this article, here is a little snippet I wrote that indexes a list of blog posts located in a MongoDB database.

import os

from whoosh.fields import Schema, ID, KEYWORD, TEXT
from whoosh.index import create_in
from whoosh.query import Term

from pymongo import Connection
from bson.objectid import ObjectId

# Set index, we index title and content as texts and tags as keywords.
# We store inside index only titles and ids.
schema = Schema(title=TEXT(stored=True), content=TEXT,
                nid=ID(stored=True), tags=KEYWORD)

# Create the index dir if it does not exist.
if not os.path.exists("index"):
    os.mkdir("index")

# Initialize index
index = create_in("index", schema)

# Initiate db connection
connection = Connection('localhost', 27017)
db = connection["cozy-home"]
posts = db.posts

# Fill index with posts from DB (Whoosh expects unicode values).
writer = index.writer()
for post in posts.find():
    writer.add_document(title=post["title"], content=post["content"],
                        nid=unicode(post["_id"]), tags=post.get("tags", u""))
writer.commit()

# Search inside the index for a post containing "test", then display the
# result.
with index.searcher() as searcher:
    result = searcher.search(Term("content", u"test"))[0]
    post = posts.find_one(ObjectId(result["nid"]))
    print result["title"]
    print post["content"]

How to quickly start a single-page application with Node.js

A problem I experienced when I started coding with the Node.js environment and the Express framework is that I had difficulties quickly making a well-structured app. I spent too much time organizing my modules and writing helpers. Paradoxically, with Express I like the fact that I am free to do what I want and don’t run into too many constraints from the framework. Finally, designing the front-end code was harder than the back-end code.

To deal with that, I used two tools that fit my requirements:

  • CompoundJS: a lightweight framework on top of Express. It offers the structure and the vital functions I need to write a good backend and configure Express properly.
  • Brunch: an application assembler to organize and build my front-end code cleanly.
  • Edit: if you want to build a small single-page app, have a look at Americano, a lightweight framework that is easy to learn (based on ExpressJS).

The downside of that choice is that there is no out-of-the-box integration. So, in this article, I will give you a way to make them work together and obtain a well-structured single-page app in a minute (see the result here). I won’t cover how the code is organized; I can just tell you that it is a typical MVC structure for the backend and a Backbone MVC for the frontend. Look at their documentation for more.

1. Generate files

I assume you have already installed Node.js. So install the needed tools via npm, the node package manager:

npm install compound -g
npm install brunch -g

Then generate the backend with Compound:

compound init blog --coffee
cd blog
npm install # dependencies installation

Remove Compound’s default frontend stuff, then generate the frontend with Brunch:

rm -rf public
brunch new client
cd client 
brunch build
cd ..

BackboneJS is the MVC framework configured by Brunch by default. If you want to work with AngularJS, you should run the project creation this way:

brunch new client --skeleton https://github.com/scotch/angular-brunch-seed

2. Plug Brunch on Railway

Then configure the backend to serve the Brunch folder instead of the old one. In config/environment.coffee, replace this line:

app.use express.static(cwd + '/public', maxAge: 86400000)

with:

app.use express.static(cwd + '/client/public', maxAge: 86400000)

or, in the AngularJS case:

app.use express.static(cwd + '/client/_public', maxAge: 86400000)

In the base template (index) app/views/layouts/application_layout.ejs, change the stylesheet and javascript links:

<%- stylesheet_link_tag('bootstrap', 'style', 'bootstrap-responsive') %>


<%- stylesheet_link_tag('app') %>


<%- javascript_include_tag('http://ajax.googleapis.com/ajax/libs/jquery/1.7.1/jquery.min.js', 'bootstrap', 'rails', 'application') %>

with:

<%- javascript_include_tag('vendor', 'app') %>

For the AngularJS case, use these lines:

<%- stylesheet_link_tag('css/app') %>
<%- javascript_include_tag('js/vendor', 'js/app') %>

Then clear the body and replace it with:

        <%- body %>

3. Make entry point and set a route to it

Create a controller to generate an entry point in app/controllers/application_controller.coffee:

action 'index', ->
    render title: "my single page app"

Then create the template file for the entry point:

mkdir app/views/application 
vim app/views/application/index.ejs

fill it with:


Then add a route to the config/routes.coffee file:

exports.routes = (map) ->     
    map.get "/", "application#index"

Check that everything is fine by starting the server and opening http://localhost:3000/ in your browser (the Brunch logo should be displayed):

compound s

4. Usage

To make Brunch automatically rebuild your app after each modification, go into the client directory and type:

brunch watch

You can write all your UI code inside the client directory. Brunch handles templates too, so don’t worry about writing your HTML code on the client side. Another good thing is that Compound and Brunch offer generators to build your models and controllers faster. Finally, Compound is very efficient for writing REST APIs: sending and parsing JSON is easy. As you can see, you now have everything you need to build an awesome single-page web application!

Feel free to comment on this tutorial and share any thoughts about it.