About the design of the Design Policy Beacon
The Design Policy Beacon was designed primarily to communicate the multifaceted nature of the European design policy ecosystem and to provide evidence and insights that support policymakers in their practice. In this article, we share our experience of doing this by describing the main challenges we faced in reducing complexity and by unveiling the rationale that led us to the current outcome. We focus our description around three key drivers for the development of the platform:
- The need to design and provide diverse formats that narrate the heterogeneous nature of the data found about design innovation policies and ecosystems;
- The provision of these formats to policymakers, so that they can explore their own ecosystem;
- The importance of finding the right tools to visualise data, helping readers easily access research results.
1. Heterogeneous data require heterogeneous formats
The “Landscape” section is the Design Policy Beacon’s main page. Here users can access all the information collected, organised geographically, by clicking markers on the map. The markers are colour-coded to indicate one of the three publication formats (Country Analysis, Country Profile, Expert Interview, all described in the introduction of the Design Policy Beacon).
It is worth noting that the first two formats (Country Analysis and Country Profile) both offer a broad analysis of national design policy ecosystems, each with a different scope, while the latter (Expert Interview) provides the specific point of view of an expert. Using different publication formats lets us respond better both to users’ varied needs and to the heterogeneous data collected. For example, national statistical data about design are seldom homogeneous, organised in a structured form, or even derived from the same source, as past research projects have also noted, such as the International Design Scoreboard developed by the University of Cambridge:
“Few nations actively collect design-related data as part of their national statistics. In most nations, design itself tends to fall between different government bodies. Some aspects of design are encompassed in government departments related to culture, media and the arts. Other aspects of design fall under the department responsible for industry, technology or innovation. In either case, specific statistics on design are rarely collected, and when they are, they are not collected with clear definitional precision. The reason for this is self-evident, as most broad definitions of design span the entire spectrum from the creation of new technology through to the generation of individual works using craft skills.” (Moultrie & Livesey, 2009: 17)
Therefore, a truthful picture of the European design policy ecosystem cannot be established from these data alone; it has to be built by merging information from different sources, including:
- quantitative data (e.g. national statistics, data on the design sector);
- qualitative data (e.g. descriptive articles, unstructured interviews);
- desk research and case studies.
In particular, the inclusion of qualitative data has also implied close engagement with local design innovation experts from all over Europe (i.e. practitioners and researchers): Design for Europe’s network has been instrumental in this sense, giving this research privileged access to, and a privileged perspective on, local design ecosystems.
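Conceptually, the merging of sources described above amounts to combining several partial records into a single country profile. The sketch below illustrates this idea only; all field names and values are invented placeholders, not actual Beacon data:

```python
# Hypothetical partial records from the three kinds of sources listed above.
quantitative = {"design_firms": 1200, "sector_turnover_meur": 340}
qualitative = {"expert_interview": "Design support is spread across ministries."}
desk_research = {"case_studies": ["National design programme 2015-2018"]}

def build_profile(country, *sources):
    """Combine several partial records into one country profile."""
    profile = {"country": country}
    for source in sources:
        profile.update(source)  # later sources fill in further fields
    return profile

profile = build_profile("Denmark", quantitative, qualitative, desk_research)
```

Each source contributes the fields it has, so the profile stays useful even when no single source covers the whole ecosystem.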
2. Policymakers can use data to explore their design innovation ecosystem
Data from official sources can be extremely valuable to policymakers, for example by supporting a deeper understanding of public problems. However, official statistics often do not seem to respond to these needs, and they could be made more accessible and useful if users were given the possibility to interact with the data:
“The interactivity of the web also opens up new possibilities. It empowers users to find the information they are looking for in complex, multi-dimensional datasets, and to explore questions in ways that the people who originally collected and analysed the data had not intended. Interactivity also helps to visualise complex innovation datasets in ways that are easier to understand for non-technical users.” (NESTA 2016)
Building on this idea, we have translated all the cases mapped (i.e. policy actions and organisations supporting design) into a machine-readable dataset, accessible in a section called “Catalogue”. When users click on an item, a panel appears, dynamically displaying data and information about that specific action. Users can also explore the catalogue through quantitative criteria (e.g. year, country, beneficiary) or through the categories defined by our analytical framework. In this way policymakers can learn from past experiences and research according to their needs.
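The catalogue’s filtering can be sketched as a simple field-matching query over a list of records. The field names and sample entries below are invented for illustration and do not reflect the Beacon’s actual data schema:

```python
# Invented sample records standing in for mapped policy actions.
catalogue = [
    {"name": "Design voucher scheme", "year": 2014, "country": "DK",
     "beneficiary": "SMEs", "category": "asset development"},
    {"name": "National design centre", "year": 2010, "country": "EE",
     "beneficiary": "public sector", "category": "framework development"},
    {"name": "Design degree programme", "year": 2016, "country": "DK",
     "beneficiary": "students", "category": "human development"},
]

def filter_catalogue(items, **criteria):
    """Return the items matching every given field/value pair."""
    return [item for item in items
            if all(item.get(field) == value
                   for field, value in criteria.items())]

danish_actions = filter_catalogue(catalogue, country="DK")
```

Because criteria are just field/value pairs, the same function serves both the quantitative filters (year, country, beneficiary) and the categories of the analytical framework.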
3. The right visualization to communicate your analysis
The use of information visualisation and data visualisation is growing in many areas, including policy, mainly because “visualisation can improve understanding and has the potential to increase the use of research evidence” (Gatto, 2015: 8). This principle also guided the choices we made in putting the Design Policy Beacon’s framework into practice (as explained in a previous article, “Explaining the Design Policy Beacon's Framework - part 1”), leading us to look for visual representations capable of showing at a glance the structure of a national ecosystem for design support, while also allowing easy comparison between two or more analyses. Inspired by the sunburst graph, in which “items in a hierarchy are laid out radially, with the top of the hierarchy at the centre and deeper levels farther away from the centre” (Stasko, 2017), the visual representation we ultimately adopted recalls the principle that a complete system should support design adequately in all three macro areas (framework, human, and asset development).
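The radial layout principle behind the sunburst can be sketched as giving each top-level area an angular span proportional to the number of items beneath it. The hierarchy below loosely echoes the three macro areas, but its leaf items are invented for illustration:

```python
import math

# Invented leaf items under the three macro areas named in the text.
hierarchy = {
    "framework development": ["policies", "regulation"],
    "human development": ["education"],
    "asset development": ["funding", "centres", "awards"],
}

def sunburst_angles(tree):
    """Assign each top-level area an angular span (in radians)
    proportional to its number of leaf items, covering the full circle."""
    total = sum(len(leaves) for leaves in tree.values())
    angles, start = {}, 0.0
    for area, leaves in tree.items():
        span = 2 * math.pi * len(leaves) / total
        angles[area] = (start, start + span)
        start += span
    return angles

angles = sunburst_angles(hierarchy)
```

An area with no actions would collapse to a zero-width wedge, which is exactly how a gap in a national ecosystem becomes visible at a glance.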
A constantly evolving process
Due to the risk of rapid obsolescence in the data presented and the continuous search for improvements, the development of the Design Policy Beacon is a constant work in progress. Some challenges have been overcome, and others will follow as the project evolves. We hope that this overview of the design solutions adopted can help others achieve similar goals of complexity reduction and sense-making.
In the meantime, we would like to hear your feedback at: email@example.com.
Gatto, M. A. (2015). Making Research Useful: Current Challenges and Good Practices in Data Visualisation. Reuters Institute for the Study of Journalism, with the support of the University of Oxford's ESRC Impact Acceleration Account, in partnership with Nesta and the Alliance for Useful Evidence. Available at: https://reutersinstitute.politics.ox.ac.uk/publication/making-research-useful (accessed March 2016).
Moultrie, J., & Livesey, F. (2009). International Design Scoreboard: Initial Indicators of International Design Capabilities. IfM Management Technology Policy, University of Cambridge. Design Council, United Kingdom.
NESTA (2016). Innovation Analytics: A Guide to New Data and Measurement in Innovation Policy. Available at: http://www.nesta.org.uk/sites/default/files/innovation_analytics_report.pdf (accessed 28 March 2017).
Stasko, J. (2017). SunBurst. Available at: http://www.cc.gatech.edu/gvu/ii/sunburst/ (accessed March 2016)
This article is the third in the Design Policy Beacon series.