Australian Bureau of Statistics

Reimagining the Census digital experience


The Census is the largest form of information gathering conducted in Australia. In preparation for the 2021 Census, the Australian Bureau of Statistics, under the Digital Service Standard process, engaged ThoughtWorks for the Alpha phase to design and test the online Census form with Australian users.

My role
Interaction design

User research


The team
Product Owner
Service Designer
User Researcher

Business Analyst

x2 Developers


The ABS undertakes the Australian Census of Population and Housing, Australia's largest statistical collection, which is conducted every five years. The aim of the Census is to accurately collect data on the key characteristics of people in Australia on Census night and the dwellings in which they live.


The information collected helps estimate Australia’s population, which is used to inform policy, distribute government funds and plan services for communities, such as housing, transport, education, industry, hospitals and environment.

In following the Digital Service Standard (DSS) process, the ABS brought in a small team of ThoughtWorkers to lead an Alpha for the online experience in preparation for the next Census in 2021.

One of our primary goals was to improve on the user experience of the previous Census in 2016. Our vision was to design and test a solution that would meet users’ needs, build confidence and enable them to complete the Census online, without having to seek help from the ABS.

In addition, the ABS was in the process of becoming a more user-centric organisation. As part of this commitment, the Census team were looking for co-creation, service design mentoring and capability uplift.


What is an Alpha?

An Alpha is the phase that comes after Discovery using the Digital Transformation Agency’s Digital Service Standard approach to service design. The goal is to test concepts of the service with users, and feed this understanding into the Beta stage of the digital service development.

Alpha was an opportunity for us to find problems with our design and decide how to solve them based on evidence from user research. During this time, it was also important for us to identify the biggest risks for the Beta stage as early as possible. Our goal by the end of Alpha was to understand what needs to be built.


What we inherited from Discovery

The ABS partnered with another agency for the Discovery phase to deliver insights from a Census research study. The agency produced a report including proposed principles for the design of the 2021 Census and 15 hypotheses for a more relevant and simpler Census experience. Following on from Discovery, we picked up these recommendations to design concepts for testing during Alpha.

The Discovery covered the 'mainstream' respondent experience comprehensively; however, several important pieces of information were not explored or communicated to the Alpha team. As a result, we had to divert time away from Alpha work to develop new hypotheses, and start from scratch on understanding internal ABS structure, processes, technology, data and policies.

Our approach

With a blended ThoughtWorks/ABS team and agile ways of working, we took a Hypothesis-Driven Design approach during Alpha. This enabled us to describe the experiment and define the conditions of success for any potential design concept.

We set up a two week sprint cycle in which we would design, build, research, synthesise and showcase our findings back to the wider Census programme.

Using the evidence we collected at the end of each sprint, we then decided whether to iterate further in the next sprint, or focus on a different problem. 


2016 Census pain points

The 2016 Census experience was simple and smooth for many people; however, those who needed assistance outside of the 'mainstream' interactions had little option but to call the Census Inquiry Service (CIS).


The 2016 Census website crash

The CIS became overloaded both before and after the website shut down. While the events of Census night in 2016 contributed to large call volumes, we learned that many of the calls were due to the lack of self-service options and the requirement for a letter (containing a login code) in order to complete the form. Other reasons included:

  • The password was single use and could not be reset so respondents had to call the CIS for a new one if theirs was lost or forgotten.
  • If respondents did not get a letter, they had to call the CIS to request one.
  • If respondents were not at home on the night of the Census and wanted to report as such, the online form directed them to call the CIS.

We tested the 2016 online form during our first sprint and validated some additional pain points that we would explore further:

  • Not enough information around the what, when, how and why of the Census.
  • Not clear as to how respondents’ contribution helps to shape change.
  • The form felt long, overwhelming and clunky for some respondents.
  • Concerns around privacy and security of data, particularly after the events of #CensusFail.

The respondent journey

The scope of our work was mainly limited to the online form, that is, from logging in to the Census through to successful submission. It was only a segment of the broader Census service, and even then, only a segment of the online aspect of the service.

For context, this flow outlines a very high level view of the basic steps in the online form, starting with the Census landing page and ending with a receipt of submission.


The product backlog

Though our backlog grew, changed and adapted based on our analysis and research findings, these were some of the key concepts we explored and tested during Alpha:

  • Starting without a letter
  • Address lookup and unresolved addresses
  • Logging back in and resuming
  • Understanding household and family relationships
  • Reporting for others (multi-form)
  • Reporting separately (personal form)
  • In-form help
  • Reviewing and editing the form
  • Submission, confirmation and receipt
  • Autosave and progress
  • Opting in to news and updates

Visualising concepts

In preparation for the first iteration of our prototype, we took a collaborative approach and ran a co-design workshop where the whole team began visualising the highest priority hypotheses on paper. This allowed us to visualise and communicate ideas quickly, but also focus more on the problems and less on the aesthetics or technology. These sketches then served as the foundation for the initial screen designs that we took into user research.


The prototype

I used Sketch and InVision for the screen designs which were then built into a responsive prototype in code by the developers. There were around 14 iterations of the prototype, each of them reflecting our learnings from user research in the previous sprint. We used the Australian Government Design System and built new components for more customised parts of the prototype. By the end of Alpha, it featured around 87 questions.

Disclaimer: this design was for research purposes only and does not necessarily reflect what the 2021 Census will look like.

Design considerations

Accessible and inclusive
It's critical that the form can be used by as many people as possible, which is why the final solution needs to be clear and simple enough that most people can use it without adaptations, while supporting those who need them.

Structure the form to help respondents
The form is huge, so we wanted to help respondents get into a rhythm of answering and let them focus on the content rather than how it's presented. We used decision branching so respondents only had to answer questions that were relevant to them.
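As a rough illustration, decision branching can be modelled as a routing function over question IDs. The questions, thresholds and routing rules below are hypothetical, not the actual Census logic:

```python
# Hypothetical sketch of decision branching: respondents only see
# questions relevant to their earlier answers. IDs and rules are
# illustrative, not the real Census form.

QUESTIONS = {
    "age": "What is your age?",
    "employment": "Did you have a job of any kind last week?",
    "hours_worked": "How many hours do you usually work per week?",
    "done": "End of section.",
}

def next_question(current, answer):
    """Return the next question ID based on the current answer."""
    if current == "age":
        # Children skip the employment questions entirely.
        return "employment" if int(answer) >= 15 else "done"
    if current == "employment":
        return "hours_worked" if answer == "yes" else "done"
    return "done"
```

Keeping the routing rules in one place like this also makes the branching logic easy to review against the paper form's sequencing instructions.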

One question per page
This pattern allowed us to split up a complex process into multiple, smaller chunks and reduce the cognitive load on the respondent. We found this approach was well suited to mobile devices, heavy validation, decision branching and saving respondents' progress.
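A minimal sketch of how one question per page might work behind the scenes, assuming a simple in-memory session store; the question list and storage here are illustrative only:

```python
# One-question-per-page sketch: each step shows a single question and
# saves the answer before advancing, so progress survives if the
# respondent drops out mid-form.

QUESTIONS = ["name", "date_of_birth", "address"]

sessions = {}  # session_id -> {"answers": {...}, "position": int}

def current_question(session_id):
    """Return the question to show, or None when the form is complete."""
    session = sessions.setdefault(session_id, {"answers": {}, "position": 0})
    if session["position"] >= len(QUESTIONS):
        return None
    return QUESTIONS[session["position"]]

def submit_answer(session_id, answer):
    """Save the answer for the current question, then advance."""
    session = sessions[session_id]
    question = QUESTIONS[session["position"]]
    session["answers"][question] = answer  # autosave before advancing
    session["position"] += 1
```

Because each page saves before advancing, validation and resume-later behaviour fall out naturally from the same per-question step.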


Generate a login code

Problem: a respondent is unable to start their online Census because they did not get a letter.

Solution: the ability for respondents to start their Census immediately by generating their own login code.
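One possible shape for such a generator, sketched under the assumption of a random alphanumeric code; the real code format, length and issuing process are not described here:

```python
import secrets
import string

# Hypothetical self-service login-code generator. The alphabet excludes
# easily confused characters (0/O, 1/I) so codes are easier to read
# back and type; length and alphabet are assumptions.
ALPHABET = "".join(
    c for c in string.ascii_uppercase + string.digits if c not in "0O1I"
)
CODE_LENGTH = 12

def generate_login_code():
    """Generate a random, hard-to-guess code a respondent can use to
    start (and later resume) their Census immediately."""
    return "".join(secrets.choice(ALPHABET) for _ in range(CODE_LENGTH))
```

Using `secrets` rather than `random` matters here, since login codes are effectively credentials.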

Back-up option for login

Problem: a respondent needs to resume their Census but has lost or forgotten their login credentials.

Solution: the ability for respondents to create their own password and set a back-up option.
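A sketch of the underlying idea: store the respondent's chosen password as a salted hash so it can be verified on resume, and simply replaced via the back-up option rather than recovered. The back-up channel itself (e.g. SMS or email) is out of scope here, and the parameters below are assumptions:

```python
import hashlib
import os

def hash_password(password, salt=None):
    """Hash a respondent-chosen password with a random salt."""
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password, salt, digest):
    """Check a login attempt against the stored salt and digest."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return candidate == digest
```

Because only the hash is stored, a forgotten password is handled by setting a new one through the back-up option, never by retrieving the old one.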


In-form help

Problem: a respondent does not understand a question, how to answer it correctly or why they need to answer it at all.

Solution: in-form help explaining why a particular piece of information is required and how to provide it correctly.

Confirmation screen

Problem: a respondent wants to know whether they have answered all of the questions or where they're up to.

Solution: a confirmation screen to allow respondents to review their answers and show when they are done.


User research

During Alpha, other members of the team and I spoke with over 100 respondents face-to-face in Melbourne, Canberra and rural NSW to find out what works and what they need to do their Census online.


Wagga Wagga – one of the regional cities we visited for research

We needed to ensure the service was not only accessible and easy to use, but inclusive of the entire population, such as people experiencing homelessness, tenants in share houses and secure apartments, Aboriginal and Torres Strait Islander peoples in remote communities, remote and transient populations, people with disabilities, Culturally and Linguistically Diverse (CaLD) groups and the older population.

Each sprint, the team spent 1-2 days conducting user research with a specific cohort of participants. Most of the time, we used an agency to recruit and schedule participants based on the desired characteristics we provided. On other occasions, we sourced participants through personal networks, or through organisations with which the ABS had established relationships.

This was not cognitive testing as is often undertaken by the ABS. Our goal was to test hypotheses about various concepts and learn more through both discussion and observation. We focused on testing design concepts rather than usability.


Research participants


Participant cohorts


Research locations


Hypotheses tested


Overall, Alpha was a success and our team were endorsed by the Digital Transformation Agency for our work.

Design thinking and extensive user research resulted in a functioning, responsive prototype of the online Census. The prototype is still being used by the ABS to support further testing on additional parts of the service.

Our research, recommendations, design approach and prototype also formed a 95+ page Alpha Findings Report, the first draft of a service blueprint and a user story map featuring over 126 stories. In addition to new ways of working, these artefacts provided the ABS with a roadmap to move forward to Beta.


Next project →