
Here at Campaign Monitor we’re continuously on the lookout for ways to improve the way we work in every area of the business. Fortunately for us, our industry is full of great companies who share their ideas and experiences freely. So here’s our contribution to that ecosystem of experience reports. This is how we make Campaign Monitor.

It’s difficult to describe our exact development process because it’s constantly evolving. With short releases, we can easily try out new approaches, refine what works for us, and discard what doesn’t.

My job in the Campaign Monitor kingdom is to test and analyze the product. Given that I’m part of the test team, you may notice a slight quality assurance flavour to this overview.

Each release cycle is typically between 6 and 8 weeks long, but some larger features have been known to take up to 8 months. We keep our quality goals high and our release dates flexible. So if we think the quality isn’t up to scratch, we just move the release date to allow ourselves time to get it right before we ship.

Our development team is currently made up of two designers, six developers and three testers. We also have two product owners and the global support team, who provide input into many of the features. Sometimes the entire team will work on the same release, and sometimes the team will work on two or more releases side-by-side.

How we collaborate

The only regular development meeting we hold is a weekly stand-up, where each team member says what they’re working on and how it’s progressing. This often includes a demo of something a team member has been working on.

The whole team keeps track of tasks and bugs using JIRA. The open task list, combined with the stand-up meetings, usually gives us a good indication of how close we are to being able to release. The test team maintains a hand-written “dashboard” in the lunch room, showing the current quality state of each feature area. The test team also sends out a weekly campaign newsletter, updating the whole company on the current release progress. As our releases are very quality-driven, testers are well placed to help give an idea of how close we are to a release date.

As we each work in closed, private offices and don’t have many meetings, communication has to find other channels. The development team’s preferred method of communication is a visit in person to someone’s office. For quick questions, we’ll use instant messenger. If we need to talk to the whole team, we’ll use Campfire, a real-time chat program, or we’ll simply save it for the stand-up meeting.

Our whole office eats lunch together every day, and it’s not uncommon to find team members having discussions in the kitchen or even over a game of ping pong. So we talk to each other all the time and as a result we really don’t find much of a need for meetings and emails.

Design first

We take a design-first approach to development. What this means is that the design will get nutted out first by the design team and the user interface will be coded during this process. As our designers are also front-end programmers, creating and modifying a design concept can be pretty quick. Much of the design planning is done in a chatroom that is public to the whole team, so any other team members can see what’s being done and comment on the design.

When it’s ready, this front-end code is handed to the developers along with a very brief specification document that explains things that aren’t obvious in the design itself.

So instead of doing huge amounts of planning up front, we usually keep it pretty light and then dive right in to see what works. Ideally most or all of the back-and-forth between teams happens in the design phase, where there is little cost to design changes. It’s a very iterative process where the design is continuously evaluated, improved, implemented and tested until what we think is the best experience eventually surfaces.

Build and test

How do we manage this iterative process without degrading code quality or causing huge delays to the release? Well, a few things help. With regard to code quality, we have a few automated processes in place that make monitoring it easier. We use continuous integration software and run unit tests against every check-in. Then a “smoke test” is automatically run against each build. The smoke test is a small suite of GUI-level tests that checks the most basic functions of the application to see if anything is broken.
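To make that concrete, here’s a minimal sketch of what one of those GUI-level smoke checks might look like. We haven’t described our actual tooling here, so this example uses Selenium WebDriver in Python, and the URL, form field names and page title are made up purely for illustration.

# Hypothetical smoke check: load the login page, sign in, and confirm the
# dashboard renders. The URL and element names are assumptions, not our real ones.
import unittest

from selenium import webdriver
from selenium.webdriver.common.by import By


class LoginSmokeTest(unittest.TestCase):
    def setUp(self):
        self.driver = webdriver.Firefox()

    def tearDown(self):
        self.driver.quit()

    def test_can_log_in_and_see_dashboard(self):
        driver = self.driver
        driver.get("https://staging.example.com/login")  # assumed staging URL

        # Fill in the login form (field names are assumptions)
        driver.find_element(By.NAME, "username").send_keys("smoke-test-user")
        driver.find_element(By.NAME, "password").send_keys("not-a-real-password")
        driver.find_element(By.ID, "login-button").click()

        # The most basic "is anything broken?" check: the dashboard page loads
        self.assertIn("Dashboard", driver.title)


if __name__ == "__main__":
    unittest.main()

The point of a smoke suite is breadth over depth: a handful of quick checks like this runs against every build and flags fundamental breakage early.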

For things that are simple to check, we use automated tests. This means we write scripts to check things for us automatically. It’s like having a robot that we can program to test the same things every day, faster than we ever could ourselves. We have several automated test suites checking various functions at the unit level, at the GUI level and via the API. By monitoring the failures from these tests, we can quickly be alerted to bugs in existing areas of the application while we’re busy working on new features.
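At the API level, an automated check can be just as lightweight: call an endpoint and assert on the status and shape of the response. Again, this is only an illustrative sketch; the endpoint, payload fields and authentication scheme below are assumptions rather than our real API.

# Hypothetical API-level check: the base URL, endpoint, auth scheme and
# response fields are assumptions used purely for illustration.
import unittest

import requests


class SubscriberListApiTest(unittest.TestCase):
    BASE_URL = "https://api.example.com/v1"   # assumed base URL
    API_KEY = "test-api-key"                  # assumed credential

    def test_list_subscribers_returns_ok(self):
        response = requests.get(
            self.BASE_URL + "/lists/123/subscribers",
            auth=(self.API_KEY, ""),
            timeout=10,
        )
        self.assertEqual(response.status_code, 200)

        # A broken deserialisation or an unexpected payload shows up here
        body = response.json()
        self.assertIn("Results", body)


if __name__ == "__main__":
    unittest.main()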

In addition, we have a test team dedicated to testing new and existing features throughout the entire development cycle. All of this allows issues to be quickly reported back to the designers and developers as they are developing features.

The other thing that helps is that we are so light on process and documentation. If something needs modifying, we don’t have to go through any tedious request and approval systems to get things done. These kinds of processes just aren’t necessary for us because we’re not restricted by client requirements, budget constraints or rigid deadlines. We have the luxury of being able to take this less formal approach to changes in product requirements.

One more thing that really helps is that our team members have very broad skillsets and aren’t restricted to specific roles. The designers are also front-end programmers, the testers can cut code, and everybody in the team helps to test the product as it’s built. Since everyone is from a technical background, we all speak the same language. The whole design to testing process may only involve three people, which means it’s pretty quick and easy to go back to the start of the process and re-design, re-code and re-test as necessary.

Ready to ship

Our features are ready to ship when our team is happy with them. We’re all very proud of this product, so this happiness metric can be a pretty challenging target to meet. When everyone on the team says we’re ready to ship, we’re ready to ship.

So, that’s how we do things around here. We’re always striving to reflect and improve, so we’d love to hear your thoughts. And if you’d like to know more about the way we do things, please leave us a comment.

Want to join our test team?

If this sounds like a way that you’d like to work, we’re currently looking for a Test Engineer to join our team in Sydney. Read a little more about this role, and we’d love to hear from you if it sounds like a good match.
