General requirements for EuroGOOS
Context of general requirements in EuroGOOS
Complete EuroGOOS report available at: https://envriplus.manageprojects.com/projects/requirements/notebooks/470/pages/57
EuroGOOS (European Global Ocean Observing System) is an international not-for-profit organisation. It promotes operational oceanography, i.e. the real-time use of oceanographic information, and develops the strategies, priorities and standards that would enable its evolution at the European level.
EuroGOOS is not an RI per se, but it has many members (40 institutes from 19 countries) that together contribute parts of an RI for ocean observing. It also depends on many RIs from across Europe: at the moment, their community includes autonomous ocean gliders, tide gauges, fixed platforms in the sea, FerryBoxes, moorings, high frequency radars and the ARGO float system. These all have their own methods for pre- and post-processing of ocean forecast information to turn it into products and services for end users. Some of the member institutes participate in the Copernicus Marine Environmental Monitoring Service, a European initiative aiming to standardise how products and services are produced, especially on the forecasting side.
EuroGOOS strives to improve coordination between its different member research institutes. Another important role of EuroGOOS is facilitating the community's access to data.
As a basic use-case for the EuroGOOS community, the RI representative provided the example of a forecast for harmful algal blooms (red tides), which could apply to any of their member research institutes. Data on circulation, temperature and salinity, but also on plankton species, is collected directly from the ocean and combined with information gathered by satellites. It is then merged with a forecast (e.g. a classification of the shellfish or plankton, or some other chemical or toxin characteristic) to decide, in case harmful plankton is identified in a particular area, where this plankton will move within the following days, and whether it will have a negative impact on shellfish, finfish, their harvesting sites or farms, and ultimately on human health.
Data is obtained from the ocean using sensors that are physically in the sea and connected to some form of acquisition system, with a telemetry system that sends the information back to the user ashore. Satellite information comes through a receiving station, either from the satellite producers themselves, from the European Space Agency, or in some cases from NASA. Forecast data comes from national monitoring programmes.
While the data coming from the satellites is normally quality assured by the satellite providers, the rest of the merged data needs to be curated by a team of experts. They may use algorithms to change its format or to derive additional parameters beyond those already available, they may analyse it and perform a trend analysis of the toxins and the plankton, and they may need to interpret some of the model information to draw conclusions about the movement of the plankton and its effects.
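As an illustration of the kind of trend analysis mentioned above, the sketch below fits a straight line to a toxin time series with NumPy. The data values, units and function name are invented for the example and do not come from the report:

```python
import numpy as np

def linear_trend(days, concentration):
    """Fit a straight line to a toxin time series by least squares and
    return the slope (units per day) and intercept."""
    slope, intercept = np.polyfit(days, concentration, 1)
    return slope, intercept

# Hypothetical weekly toxin measurements (e.g. micrograms per litre)
days = np.array([0, 7, 14, 21, 28])
toxin = np.array([1.0, 1.4, 1.9, 2.5, 3.0])

slope, intercept = linear_trend(days, toxin)
print(f"trend: {slope:.3f} units/day")  # positive slope -> rising toxin levels
```

In practice experts would of course use more robust methods (seasonal decomposition, quality-flagged data), but a simple least-squares slope is the core of a basic trend check.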
The data is usually collected, catalogued and quality assured within the member institutes, which act as national oceanographic data centres. They typically make it available through web access portals and discovery tools to end users, in a format that corresponds to their needs. For the algal bloom use case, for example, people harvesting shellfish may only be interested in the experts' conclusion about whether it is safe to carry out their business in the following period, while a large consortium company may be interested in more long-term data and scientific explanations to use within its applications. Research institutes also often share data and information amongst themselves (e.g. to warn about a coming storm).
The main stakeholders involved in a use case such as the one described above are therefore the research institutes, the satellite providers, the teams of experts, and end users, who may range from commercial companies to other research institutes.
EuroGOOS collaborates with several RIs for ocean observing from across Europe, and some of their members have set up EuroGOOS Task Teams. In the following sections, details of the operation of the following Task Teams, and of the communities to which they belong, are provided:
- The HF Radar Task Team and Community
- The Tide Gauges Task Team and Community
- The FerryBox Community
Summary of EuroGOOS general requirements
The RI representative for EuroGOOS considered that the following are some important aspects that need to be considered by the ocean observing community:
- Integration and coherence;
- The promotion of the community so that it can become more established and attract funders; EuroGOOS is working towards improving coordination between its different member research institutes by focusing on different platforms and topic areas through its working groups and task teams.
- Bringing in more staff who can promote their system.
A big open problem for the operational oceanography community, pertinent to handling and exploiting data, is that four-dimensional variational data assimilation (4DVar), which can greatly improve the quality of forecasts, is computationally very expensive (it can double or triple computational overheads). Many of the EuroGOOS member institutes have set out to implement 4DVar, but found that they lack the HPC (high performance computing) capacity or resources. The RI representative called for improvements and enhancements in the HPC or coding environment that would make 4DVar more feasible.
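For orientation, strong-constraint 4DVar estimates an initial state \(\mathbf{x}\) by minimising a cost function of the textbook form below, where \(\mathbf{x}_b\) is the background state, \(\mathbf{B}\) and \(\mathbf{R}_i\) are the background and observation error covariances, \(\mathbf{y}_i\) the observations at time \(i\), \(H_i\) the observation operator and \(M_i\) the model propagating the state from the initial time to time \(i\). This standard formulation is included for context and is not taken from the EuroGOOS report; the repeated model integrations \(M_i(\mathbf{x})\) (and their adjoints) inside the sum are what drive the computational cost noted above:

```latex
J(\mathbf{x}) = \tfrac{1}{2}\,(\mathbf{x}-\mathbf{x}_b)^{\mathsf T}\mathbf{B}^{-1}(\mathbf{x}-\mathbf{x}_b)
  + \tfrac{1}{2}\sum_{i=0}^{N}\bigl(\mathbf{y}_i - H_i(M_i(\mathbf{x}))\bigr)^{\mathsf T}\mathbf{R}_i^{-1}\bigl(\mathbf{y}_i - H_i(M_i(\mathbf{x}))\bigr)
```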
The different member research institutes of EuroGOOS often use HPC for processing data. The scripting languages most frequently used are Matlab, Fortran, Python and IDL. Recently, there has been a movement from proprietary software to more open-source solutions, and towards getting Fortran or similar tools to do more of the processing. Most importantly, the RI representative considered that there is a real trend towards using the free scripting language Python. However, a lot of legacy code in Fortran cannot be converted to Python.
The processing of the data involves a lot of parallelisation of the code. Moreover, there is a big move amongst member research institutes to push big computational tasks to large machines (e.g. the Irish Marine Institute runs its data at EPCC in Edinburgh), and only do pre- and post-processing on site.
The operational oceanographic community has a strong commitment to free and open data access, and therefore the great majority of the data collected by the members of EuroGOOS is free, both at national and European level. Conditions on the use of such free data by research institutes and end users are, as usual, governed by exchange and copyright agreements (e.g. the data stores needing to be acknowledged, or one of the scientists being included as a co-author in publications). There are some restrictions on the sharing of data in some institutions, where scientists want to ensure that it is of sufficiently high quality before it is released, or often in the case of chemical or biological data which may be used for specific purposes by commercial companies.
EuroGOOS brings to ENVRIplus expertise in the marine domain, and an understanding of end users and customers. It does not have its own ships or platforms for HPC, but all of its member institutes do. It is committed to helping ENVRIplus define use cases and develop tools.
EuroGOOS does not handle data per se, but facilitates data access between its member institutes. One of its working groups is the Data Management, Exchange and Quality Working Group (DATA-MEQ). Its role is that of advising on data management, but also on the end-to-end gathering of oceanographic data, the quality assurance of the data, the standards that should be applied for data collection, processing and quality assurance, and the ways in which data should be made available to end users. The working group is advanced and mature.
Regarding the standards used, the DATA-MEQ working group uses the ISO standards for metadata, and also other standards for the data itself. There has been a considerable amount of work at the European level between different project initiatives to establish data standards and common definitions across the marine community.
Regarding the software used, this varies between the member research institutes, but at the European level there is a commitment to making sure that the underlying standards are the same even if the software used is different.
Topics 3 (Cataloguing), 4 (Processing) and 6 (Optimization) mostly cross-link with the EuroGOOS data management plan. The ocean observing community puts a lot of effort into developing standards for these topics. As data volumes are growing, optimization has recently become very important.
EuroGOOS do not have constraints on handling data, because they do not perform data handling per se. Data handling is performed by, and distributed across, their different member research institutes and European initiatives such as EMODNet and Copernicus. Each research institute has its own data handling procedures. EuroGOOS only facilitates access to data and frees up access to datasets which have proved difficult to access in the past, so that they become more openly available to the community.
EuroGOOS do not have set procedures for security and access. Their work is more political than technical. In particular, when a dataset which is of great potential value to the community is unavailable, they discuss with the people involved what can be done and how they can help to make it more freely available. They have also made efforts towards making access to the data more streamlined and straightforward for end users. For example, they have worked together with the data providers to reduce the effort required from end users in terms of registering and filling in many web forms before they can access the data. Sometimes, metrics about the end users can be obtained directly from their IP addresses.
Regarding the open-access policies of the software and computational environments used, this varies between the different member institutes of EuroGOOS. There is no clear trend towards using open-source or paid-for software or computational environments. Many of the research institutes use Matlab and the Microsoft suite of tools (the latter especially for the basic quality assurance of biological or smaller datasets), which require a licence; some also use the free scripting language Python or open-source tools (e.g. Ocean Data Viewer), and others (e.g. the Irish Marine Institute) have their own application development teams which develop bespoke tools for their purposes only.
1. The HF Radar Task Team and Community 
The HF Radar Task Team of EuroGOOS is a working group on HF Radars in Europe. Its main purpose is that of encouraging networking activities around HF Radar applications in Europe. Since the beginning of September, many participants of the task team have started the JERICO-NEXT project, the second part of JERICO. Within the HF Radar community, this project will contribute to the homogenisation of workflows, standardisation and better data management.
HF Radar is a key technology for ocean monitoring. It provides a way to monitor surface currents and waves with high spatial and temporal resolution. It can be used for search and rescue, enabling analysis and short-term prediction of transport at the sea surface. It is also used for monitoring the dispersion of pollutants, for example during oil spill accidents. Some institutes or countries also use this information for scientific purposes, to better understand hydrodynamics in a specific region.
High frequency (HF) radar systems measure the speed and direction of ocean surface currents in near real time. At least two stations are set up on the coast, and each of them provides radial data. This data is collected in data centres and analysed by operators. For some users, such as modellers, these radial outputs can be used directly, together with an indication of the error in the data, for example for model assessment or data assimilation. For the majority of users, radial data from at least two stations is combined to obtain total data in the form of 2D vectors of surface currents, yielding a map of surface-current vectors for a specific area. The total area depends on the coverage of each antenna. Depending on the working frequency, antennas can have a higher resolution in a smaller area, or a lower resolution in a wider area. The map of vectors is typically provided in a NetCDF file with parameters depending on requirements: frequently a U and V component, and sometimes magnitude and direction. The file typically contains data over a specific period (hourly file, monthly file, etc.). Additionally, the map of vectors is usually accompanied by an estimate of the error of the data, which is useful for data assimilation in hydrodynamic models.
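The combination of radial measurements into a total (U, V) vector described above can be sketched as a small least-squares problem. The bearings, speeds and function name below are illustrative assumptions for the example, not the operators' actual processing software:

```python
import numpy as np

def combine_radials(bearings_deg, radial_speeds):
    """Least-squares combination of radial current measurements from
    two or more HF radar stations into a total (U, V) surface-current
    vector. Each bearing is the direction (degrees clockwise from
    north) along which that station measures the radial speed."""
    theta = np.radians(bearings_deg)
    # Each radial speed is the projection of (u, v) onto its bearing:
    #   r_i = u * sin(theta_i) + v * cos(theta_i)
    A = np.column_stack([np.sin(theta), np.cos(theta)])
    (u, v), *_ = np.linalg.lstsq(A, np.asarray(radial_speeds), rcond=None)
    return u, v

# Two hypothetical stations observing the same true current (u=0.3, v=0.1) m/s
r1 = 0.3 * np.sin(np.radians(45)) + 0.1 * np.cos(np.radians(45))
r2 = 0.3 * np.sin(np.radians(135)) + 0.1 * np.cos(np.radians(135))
u, v = combine_radials([45.0, 135.0], [r1, r2])
print(round(u, 3), round(v, 3))  # recovers the true vector
```

With more than two stations the same least-squares formulation over-determines (u, v), which is also where an error estimate for the total vector can come from.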
There is sometimes some coordination at national level between different data centres (e.g. Puertos del Estado in Spain), but usually this is not well organised. In collaboration with the EMODnet Physics group, the HF Radar Task Team has successfully managed to connect four HF Radar systems in Europe in a standardised way to the EMODnet Physics portal. They are currently working on connecting a fifth from Malta, and during future phases of EMODnet Physics they would like to add even more systems and encourage the operators to be connected.
Other than through the EMODnet Physics portal for the four connected systems, results may be provided to users through different means, such as a THREDDS Data Server or FTP with ASCII data. Some operators do not currently provide raw data because they have not yet put it on an adequate platform; others do not want to provide the raw data. The main operators who do provide it, though, are supporters of an open-access philosophy, and generally use online open-access portals. The EMODnet Physics initiative aims to help operators provide their data to a wider community.
The users of the data can be scientists, modellers, public agencies which may integrate the data in search and rescue or forecasting tools, or companies which may need forecasts in case of accidents. The users may have certain requests (e.g. not to have any gaps in the vectors if they need complete fields for their tools), and they usually want proof of the quality of the data. Operators have some data quality control in place, but the standards that they use for it are not homogeneous across Europe.
The HF Radar Task Team has many interactions with other coastal platforms. For example, it collaborates with the other Task Teams of EuroGOOS, and plans joint actions within the JERICO-NEXT project.
The HF Radar community can share with ENVRIplus or other RIs:
- historical time series of surface currents, and waves in some cases. This data is generally free, but this depends on each operator and country.
- users/expertise to provide advice on various topics
- access to related scholarly publications
- access to grey literature; they will produce reports to disseminate their activity as part of EuroGOOS and its working groups.
The HF Radar community cannot share software, because they have not yet decided on common software. For data processing, operators may use the software of the HF radar provider (CODAR, WERA, etc.) or their own software. Some of the partners use open software which could be shared (e.g. some Matlab toolboxes for working on HF Radar data have been developed by an American company and are open to all HF Radar communities), while others use licensed software. As part of the homogenisation work proposed during the JERICO-NEXT project, the task team would like to decide on common tools.
The HF Radar community cannot share computational resources, because currently the different operators provide their own data and there is no common computational infrastructure. Moreover, the analysis required for HF Radar technology does not have very large computational requirements compared with other technologies. There are discussions in the EMODnet Physics context about building a node where the data from all the stations connected to this network would be available, but it is not yet clear whether EMODnet would assume full responsibility for it.
The HF Radar community do not intend to give access to their infrastructure, but only to their data, which in the JERICO-NEXT framework is called “virtual access”. This is because the operators strive to keep the configured HF Radar systems stable, in real time and continuously, which would be difficult to manage if someone else were allowed to tune them. Although a project in Norway has attempted to demonstrate that HF Radar technology could be quickly deployed on request by different operators in case of accidents, there have been no official attempts to share this technology.
Through participation in ENVRIplus, the HF Radar Task Team would value:
- learning about other European RIs and drawing inspiration from them for deciding on the general objectives and services that they could provide at the European level; the HF Radar Task Team is only at the beginning of the roadmap, but will soon need help in organising a European node for providing HF Radar data in a homogeneous way to European users.
- from a technological perspective, getting recommendations about the design of their common data system, including formats or data platforms and data treatments.
- drawing inspiration from RIs about ways to distribute the data to end users using applications which are more focused on this aspect.
Within the JERICO-NEXT context, the HF Radar Task Team will conduct homogenisation and standardisation work. A particular consideration is that the main European operators use systems from one of two HF Radar technology manufacturers, CODAR (an American company) and WERA (a German one), whose data formats are different. Also, operators currently perform different types of data analysis depending on the system. Through participation in JERICO-NEXT, the HF Radar community will attempt to homogenise the formats and data analysis methodologies amongst operators. The task team will strive to share the benefits from this project with a wider community, and involve a wider community in the definition of standards.
The HF Radar community are generally trying to use standards which are also used by other communities whose data collection and analysis are similar. They are currently using the conventional standards on ocean variables, such as INSPIRE, and both general and specific best practices and recommendations for sharing ocean data, such as those provided by the EuroGOOS reports. The weakness of these standards is the difficulty of finding a compromise between different end-user requirements. The HF Radar Task Team is working on developing them further.
Regarding software, what is important is to make clear to end users what kind of analysis has been performed on the data, as they are very interested to know whether the data is raw or, if not, what filters or interpolations have been applied to it. The HF Radar Task Team would like to develop a homogeneous methodology for producing data of a particular quality. Their objective is also to promote the sharing of open-source software, although they do not wish to impose specific software on all of the operators.
The HF Radar Task Team would like to change working practices as part of their homogenisation work together with JERICO-NEXT.
During their first year of operation, the HF Radar Task Team have focused on making their initiative visible, finding funds to pay for their networking activities and identifying projects which can help them attain their objectives. They are currently trying to attract more people to provide their inputs, connect their systems, participate in these homogenisation activities, and define and apply standards. The RI representative strongly believes that these homogenisation activities will help him and the HF Radar community work better and faster. It is for this reason that many actors have agreed to join their networking activities.
The HF Radar Task Team do not have a finalised data management plan yet, but do have a plan and some initial ideas, and all of the ENVRIplus topics cross-link with them.
The HF Radar community have cost constraints for data handling and exploitation, because the availability of HF Radar data is dependent on the maintenance of the systems and the development of the networks. Capital costs are needed to extend the observing network. Maintenance costs are needed to make the network sustainable. Operational costs are used to ensure the quality of data. R&D funds are needed to optimize the products and make them totally transferable. Operational costs for data management are relatively low.
The community do not have security or privacy constraints, because the data is generally open (with a few exceptions), as this is the main objective for operational oceanographic data. If the data is acquired for scientific purposes, the scientists may not want to release it until their results are published. When the operators are themselves the end users, as in some search and rescue cases, the need to share the data openly is also less obvious to them. The Task Team may ask operators for collaboration in improving the quality of the data from the provider's point of view.
The community do not have an overall approach to security and access, because the data (current and wave) is usually not sensitive.
Regarding access for scrutiny and public review, the HF Radar Task Team may only require operators to collaborate in making the source of the data visible.
2. The Tide Gauge Task Team and Community 
The EuroGOOS Tide Gauge Task Team, established in 2015 (Terms of Reference approved in May during the EuroGOOS annual meeting), has the main goal of looking in detail at the weaknesses of the existing network of tide gauges in Europe and proposing steps forward to strengthen coordination and the homogeneous exploitation of the data. It will provide detailed answers to these questions in the following months. They have currently started producing deliverables and making plans for action.
The basic purpose of tide gauges is to monitor sea level changes along the coast across all frequency ranges, from tsunamis and seiches to tides, measuring storm surges and the long-term mean sea level evolution and its relation to climate change. Sea level data provided by tide gauges is used for a diverse range of applications, both in operational oceanography and in other areas, such as: tidal computation and forecast, sounding reduction, aid to navigation, harbour operation and dredging activities, studies of long-term mean sea level changes and extremes and their relation to climate change, sea level hazard warning systems (for storm surges, tsunamis), oceanographic model validation, satellite (altimeter) data calibration, datum definitions (computation of chart datum and national datum) and hydrography.
The tide gauge community is responsible for sea level data at coastal points, usually harbours, as measured by tide gauges, typically referenced to a fixed benchmark on land. Time stamps and datum definitions (i.e. the references of the measurements) are critical additional information for these measurements. The EuroGOOS Tide Gauge Task Team aims to improve data exchange and collaboration among tide gauge operators along European, Mediterranean and adjacent-sea country coastlines.
Europe has a large and diverse number of tide gauge networks, and they may operate slightly differently. However, common basic requirements established a long time ago within GLOSS (Global Sea Level Observing System) are usually met. Most of the tide gauges, located at a harbour quay, nowadays consist of a radar or acoustic sensor located above the water surface that derives sea level from the measured distance between the sensor and the water. However, other technologies such as traditional float gauges or pressure sensors (located well below the low tide, in the water) are also employed. Generally, national centres receive data from the sensors via the internet, by email, FTP or HTTP. Traditionally, average sea level data was provided at sampling intervals of 6 or 10 minutes. Since the tsunami in the Indian Ocean in 2004, most countries have started to provide one-minute (or shorter) average sea level data for the purposes of tsunami warning and the early detection of high frequency oscillations. The format of the data sent by the station to the national centres differs between countries. In Puertos del Estado, for example, they use a specific XML format named ESEO XML, which was defined years ago within a national Spanish project involving institutions working with oceanography data. Other countries may just use ASCII files. National centres usually display the data in national data portals. From the national centres, the data is then re-distributed, depending on the application, to regional, European or global portals, using the NetCDF format as a common standard in those portals devoted to operational oceanography.
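The move from 6- or 10-minute to one-minute averages can be illustrated with a minimal resampling sketch. The sampling times, levels and function name below are hypothetical; operational software would additionally handle data gaps, spikes and quality flags:

```python
import numpy as np

def one_minute_means(times_s, levels_m):
    """Average raw sea-level samples (times in seconds, levels in metres)
    into one-minute bins, of the kind many centres began producing for
    tsunami warning after the 2004 Indian Ocean tsunami."""
    times_s = np.asarray(times_s)
    levels_m = np.asarray(levels_m)
    minutes = times_s // 60  # integer minute index of each sample
    return {int(m): float(levels_m[minutes == m].mean())
            for m in np.unique(minutes)}

# Hypothetical 15-second samples spanning two minutes
t = [0, 15, 30, 45, 60, 75, 90, 105]
h = [1.00, 1.02, 1.04, 1.06, 1.10, 1.12, 1.14, 1.16]
means = one_minute_means(t, h)
print(means)
```

The same binning logic applies whatever the target interval; only the divisor changes for 6- or 10-minute products.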
From Puertos del Estado, for example, data is distributed to the different ROOSs of EuroGOOS to which they belong (the IBIROOS in-situ tag and the MONGOOS in-situ tag), to the IOC Sea Level Station Monitoring Facility related to GLOSS, which compiles sea level data across the globe, and to the Permanent Service for Mean Sea Level. Moreover, they send one-minute ASCII files to the National Tsunami Warning Centre in Spain and to regional tsunami warning centres in France. Each of these portals has different requirements on latency and sampling, so the national centres adapt the data for each required application. Transmission latency, data quality control and processing depend on the application. Most of the data quality control and processing is done by the national centres, and the products (real or near-real time data, tidal constants, extremes, mean sea levels, etc.) are disseminated to other data portals and to the different users.
The users of the data are mainly universities and private companies doing particular work for harbour construction, harbour design, coastal infrastructure and navigation. They are assumed to be responsible for the use that they make of the data.
The tide gauge community can share datasets with ENVRIplus, which are in principle freely available from most countries in Europe through the different data portals mentioned above. Some countries in the Mediterranean Sea, however, still have a restrictive data policy and do not share their data with the international community. It can also share users/expertise to provide advice on various topics, and access to related scholarly publications and grey literature, usually available on national or regional websites.
Through participation in ENVRIplus, the Tide Gauge Task Team would like to achieve the same objectives as EuroGOOS:
- The most important is improving communication and collaboration between tide gauge operators. In the past, a sea level community worked together to obtain funded projects, but this community has been mostly disconnected for the last 10 years, and reconnecting it is one of the main aims of the Task Team.
- They would appreciate help in reviewing what each national centre is doing concerning quality control and data processing, considering that these are also starting to be performed through regional data portals.
- The Task Team would like to be supported in the continuous testing of the different types of tide gauges. Although this is an old type of measurement (measurements started in the 19th century) and recent technologies have appeared with many advantages over the old ones, there are still open questions and problems to solve, for example the effect of waves on the measurement. There are also issues with the calibration of equipment to achieve an accuracy within 1 mm/year for mean sea levels, with the levelling procedures, and with the measurement of the vertical movement of the land in order to know absolute sea level variations.
Regarding standards, each national centre decides which standards to use. Standards on quality control and data processing are available from different international programmes and European initiatives such as the IHO, GLOSS, NEAMTWS and the MyOcean projects. For example, Puertos del Estado follows the standards for quality control and data processing that were developed within GLOSS, NEAMTWS and MyOcean, to which they have added some new developments for real-time quality control and the detection of high frequency sea level oscillations (e.g. tsunamis). It has also contributed to the new GLOSS standards, in particular the GLOSS manual of sea level processing, and to the standards for IBIROOS (the sea level quality control standards now used in IBIROOS were proposed by them in collaboration with BODC (UK) and DMI (Denmark)) and other ROOSs in general. The Tide Gauge Task Team will check whether the GLOSS and other European standards are followed everywhere. It may recommend new standards and working practices for improving the current system in Europe, if the increasing use of tide gauge data in real time and the new need for sea level related products demand new developments.
Regarding software, each organisation has its own software and computational environments for collecting, processing and storing data, and there is no de facto standard software. However, the main steps are basically the same, and general recommendations on quality control and sea level data processing, provided by, for example, GLOSS, EuroGOOS and the MyOcean projects, are followed. Most solutions are developed by organisations on their own, but in some cases some steps of the data processing chain make use of old, well-known and internationally adopted software, such as the Foreman tidal analysis and prediction package. In the case of Puertos del Estado, most of their software is old, based on Fortran and C shell scripts developed in house, and they have been adding elements to it for new applications. They are also using some software taken from other organisations. They do not have a complete tool in a unique language, and what they have cannot easily be shared with other organisations; some work would be needed in this sense. The Tide Gauge Task Team will check whether the national operators use similar software tools, or would need tools from the community.
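In the same spirit as the harmonic analysis underlying packages like Foreman's (though not that package itself), a minimal least-squares fit of tidal constituents can be sketched as below. The function name, constituent choice and record are synthetic assumptions for illustration:

```python
import numpy as np

def fit_constituents(t_hours, sea_level, omegas):
    """Least-squares harmonic analysis: fit
        Z0 + sum_i (A_i cos(w_i t) + B_i sin(w_i t))
    to a sea-level record, returning [Z0, A_1, B_1, A_2, B_2, ...]
    for the given constituent frequencies (radians per hour)."""
    cols = [np.ones_like(t_hours)]
    for w in omegas:
        cols += [np.cos(w * t_hours), np.sin(w * t_hours)]
    design = np.column_stack(cols)
    coeffs, *_ = np.linalg.lstsq(design, sea_level, rcond=None)
    return coeffs

# Synthetic record containing only the M2 constituent (period ~12.42 h)
w_m2 = 2 * np.pi / 12.42
t = np.arange(0, 720, 0.5)  # 30 days, half-hourly samples
h = 2.0 + 0.5 * np.cos(w_m2 * t) + 0.2 * np.sin(w_m2 * t)
z0, a, b = fit_constituents(t, h, [w_m2])
print(round(z0, 3), round(a, 3), round(b, 3))
```

Real packages additionally handle nodal corrections, many constituents at once and record-length limits on which constituents can be separated; the least-squares core, however, is as above.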
The following are some of the main issues with which the Tide Gauge Task team is confronted nowadays:
- Encouraging member organizations to make their data available
- Promoting the continuity of existing tide gauge stations in Europe in the face of budget constraints
- Establishing new tide gauges with real time sea level data transmission in the North of Africa
The Tide Gauge Task Team has yet to establish a data management plan.
Regarding non-functional requirements, the cost of a tide gauge station is usually not an issue. Maintenance and operational costs are important constraints, especially for networks with wide coverage and/or remote stations. Tide gauges require frequent on-site calibrations in order to meet standard quality requirements. Operational use (e.g. tsunami warning) requires robust and redundant communication systems. Security, privacy, access for scrutiny and public review, and the computational environment on which the software runs are not issues for the tide gauge network.
The situation regarding access to data, software and computational environments is not the same in every organisation, and will be clarified by the Tide Gauge Task Team.
The following are some big open problems for the tide gauge community:
- Real time data availability, especially in the Mediterranean Sea
- Real time quality check and warning
- Vertical land motion monitoring
Generic questions (for all topics)
- 1. What is the basic purpose of your RI, technically speaking?
To obtain on-line in situ oceanographic and environmental data from ships-of-opportunity
- a. Could you describe a basic use-case involving interaction with the RI?
Phytoplankton-related data, such as chlorophyll-a or nutrients, deliver insight into the trophic state of the sea (problem areas: eutrophication, harmful algal blooms, oxygen concentrations)
- b. Could you describe how data is acquired, curated and made available to users?
Data are collected through a PC on board the ship and are available for direct inspection at the institute. The data are quality checked and loaded into an Oracle-based FerryBox database, which contains data from all FerryBoxes installed in Europe. The data are made available to users through a website.
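The quality-check step in this chain can be sketched as a per-parameter gross-range test, a common first screen for in situ oceanographic data. The parameter names and limits below are illustrative assumptions, not the operational FerryBox QC configuration.

```python
# Illustrative gross-range limits per parameter (assumed values,
# not the operational FerryBox QC limits).
LIMITS = {
    "temperature_C": (-2.0, 35.0),
    "salinity_psu": (0.0, 42.0),
    "chlorophyll_ug_l": (0.0, 100.0),
}

def qc_flag(record):
    """Return a copy of the record with a QC flag per parameter:
    1 = good (within range), 4 = bad (out of range or missing)."""
    flagged = dict(record)
    for name, (lo, hi) in LIMITS.items():
        v = record.get(name)
        flagged[name + "_qc"] = 1 if v is not None and lo <= v <= hi else 4
    return flagged
```

Flagged records, rather than filtered ones, would then be loaded into the database so that users can see both the values and their quality assessment.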
- c. Could you describe the software and computational environments involved
MS Windows-based computers with LabVIEW software are commonly used for controlling the system and for data acquisition. Communication with the shore station (remote control and data transfer) is via mobile phone or satellite connection. The data are stored in an Oracle database running on Linux.
- d. What are the responsibilities of the users who are involved in this use case
Publication of results, involvement of specific end users and policy makers
- 2. What datasets are available for sharing with other RIs as part of ENVRIplus? Under what conditions are they available?
All datasets offered through the websites of the different ROOSs (e.g. NOOS, BOOS…) are available for sharing, on condition that the source of the data is acknowledged
- 3. Apart from datasets, does your RI also bring to ENVRIplus:
- Software? In this case, is it open source?
- Computing resources (for running datasets through your software or software on your datasets)?
Oracle-based database on Linux with open access.
- Access to instrumentation/detectors or lab equipment? If so, what are the open-access conditions?
Access is possible through e.g. EU projects such as JERICO.
- Are there any bilateral agreements?
- Users/expertise to provide advice on various topics?
Yes, on all items linked to the use of FerryBox systems, through annual meetings and publications
- Access to related scholarly publications?
Yes (e.g. Petersen, W. (2014). "FerryBox systems: State-of-the-art in Europe and future development". Journal of Marine Systems. DOI: 10.1016/j.jmarsys.2014.07.003)
- Access to related grey literature (e.g. technical reports)?
Yes, e.g. the final report of the EU FerryBox project
- 4. What plans does your RI already have for data, its management and exploitation?
Make the data freely available; further analysis in scientific papers
- a. Are you using any particular standard(s)?
Most measurements are standardized and checked through lab analyses and calibration procedures
- Strengths and weaknesses:
Difficulty in standardizing chlorophyll measurements with on-board fluorimeters
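One common way to reconcile on-board fluorimeter readings with standard methods is to calibrate them against chlorophyll-a from lab analyses of matching water samples. The sketch below assumes a simple linear (ordinary least-squares) calibration; the function name and sample values are illustrative, not the FerryBox community's actual procedure.

```python
def calibrate_fluorometer(fluor, lab_chl):
    """Ordinary least-squares fit: lab_chl ~ slope * fluor + offset.

    fluor: raw in situ fluorimeter readings
    lab_chl: chlorophyll-a from lab analysis of matching water samples
    Returns (slope, offset) for converting raw readings to chlorophyll-a.
    """
    n = len(fluor)
    mx = sum(fluor) / n
    my = sum(lab_chl) / n
    sxx = sum((x - mx) ** 2 for x in fluor)
    sxy = sum((x - mx) * (y - my) for x, y in zip(fluor, lab_chl))
    slope = sxy / sxx
    return slope, my - slope * mx
```

The difficulty noted above is precisely that such a calibration drifts with phytoplankton community composition and light history, so the fit has to be repeated per route and season.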
- b. Are you using any particular software(s)?
MATLAB and R are commonly used for data evaluation. PL/SQL is used for database-related tasks
- Strengths and weaknesses:
- c. Are you considering changing the current:
- i. standard(s)
- ii. software
- iii. working practices
as part of a future plan? Please provide documentation/links for all the above which apply.
- 5. What part of your RI needs to be improved in order:
- For the RI to achieve its operational goals?
No need for improvements yet
- For you to be able to do your work?
Long term financial support
- 6. Do topics [1-6] cross-link with your data management plan?
Yes, I think so
If so please provide the documentation/links:
List of publications related to the use of FerryBox systems (www.ferrybox.org)
- 7. Does your RI have non-functional constraints for data handling and exploitation? For example:
- Capital costs
- Maintenance costs :
Maintenance costs are mainly financed by research money
- Operational costs:
are based on our research funding
Not within European seas, but e.g. the Russian area in the eastern Baltic Sea
- Computational environment in which your software runs
- Access for scrutiny and public review:
through international advisory board
If so please provide the documentation/links
- 8. Do you have an overall approach to security and access?
- 9. Are your data, software and computational environment subject to an open-access policy?
- 10. What are the big open problems for your RI pertinent to handling and exploiting your data?
Availability of data from other partners and common QC procedures
Formalities (who & when)
|Cristina Adriana Alexandru|
|Glenn Nolan (EuroGOOS), Julien Mader (HF Radar Task Team, working at AZTI in Spain), Begoña Pérez Gómez (Tide Gauge Task Team, working at Puertos del Estado in Spain), Franciscus Colijn, Willy Petersen, G. Breitbach (FerryBox Task Team)|
Period of requirements collection
|July - December 2015|