White Papers


Background

These white papers formed the basis for discussion at the meeting in Exeter. Each was prepared within a few weeks by a small group of experts selected in advance by the International Organising Committee to represent a broad range of backgrounds and perspectives. We are very grateful to these authors for giving their time so willingly to this task at such short notice. The papers are not intended to be publication-quality pieces – a process that would naturally take somewhat longer to achieve.

The white papers were written to raise the big-ticket items that require further consideration for the successful implementation of a holistic project encompassing all aspects from data recovery through analysis to delivery to end users. They provided a framework for the breakout and plenary discussions at the workshop. The IOC felt strongly that starting from a blank sheet of paper would not be conducive to reaching agreement in a relatively short meeting.

It is important to stress that the white papers were emphatically not intended to be interpreted as a definitive plan.

Two stages of review informed the final agreed meeting outcome:

1. The white papers were made publicly available for a comment period through a moderated blog.
2. At the meeting, the approximately 75 experts in attendance discussed and refined the plans, both in breakout groups and in plenary. Stringent efforts were made to ensure that public comments were taken into account to the extent possible.

The white paper discussion blog is at http://surfacetemperatures.blogspot.com/ and comments remained open until 1st September. Below, the white papers are indexed numerically by their slot on the meeting agenda.

Day 1 - issues pertaining to the creation and maintenance of a databank

3.  Retrieval of historical data
4.  Near real-time updates
5.  Data policy
6.  Data provenance, version control, configuration management

Day 2 - analysis, creation of datasets, and performance assessment

8.  Creation of quality controlled homogenised datasets from the databank
9.  Benchmarking homogenisation algorithm performance against test cases
10.  Dataset algorithm performance assessment based upon all efforts
11.  Spatial and temporal interpolation and supplement

Day 3 - publication, presentation and outreach

13.  Publication, collation of results, presentation of audit trails
14.  Solicitation of input from the community at large including non-climate fields and discussion of web presence
15.  Governance
16.  Interactions with other activities
