Superior strategies are based on superior data
We provide shipping companies with high-quality data for their requirements in the areas of SAFETY, COMPLIANCE, PLANNING and OPTIMIZATION.
Comprehensive fleet analyses and optimizations are only possible with a comprehensive digitalization of your ship systems. Hoppe collects your data and provides it to you, comprehensively validated if desired, anytime and anywhere.

Information Services
Fleet Connect
Validating a large share of a ship's operational and nautical measurement values onshore is the key demand of today's maritime data acquisition.
For vessel owners and operators, the challenge of verifying and validating the large amount of data received on shore keeps growing. Unclean data can significantly lower the optimization potential and may even lead to disadvantages in vessel operation.
The Data Supplier
Data Butler
Provision of excellent ship data
The DATA BUTLER is a cloud-based service that provides ship and fleet data to the customer via a standard interface for further processing. Different service levels are available, from high-resolution data provision on demand up to full data quality information.
Key Features
- Provision of ship data with easy access
- Full system responsibility
- Unlimited cloud-based data storage – with 24/7 access
- Focus on cyber security – encrypted ship-to-shore data transfer
- Remote service, updates and customer-specific configuration
The Data Verifier
Data Inspector
Ensure Data Quality
The DATA INSPECTOR is an additional service to the DATA BUTLER, also available as a stand-alone service. It comprises daily data checks, fleet data quality reports and troubleshooting measures, and thus provides the data basis for optimized fleet planning and vessel operation.
Key Features
- Ensured data quality as a precondition for compliance, planning and optimization
- Continuous system health check and daily data evaluation by qualified service technicians, analysts and marine engineers
- Fleet data transparency through automatic reports, including system health and quality KPIs
- In-depth evaluation of primary signals to detect implausibilities, drifts and offsets
- Fast response – (remote) service attendance reduces downtime to a minimum
Feature Overview
Data Butler service levels: Transparency, Quality. Data Inspector service levels: Maintenance, Analyzer.

On board
- Ship data server application incl. configuration
- Web-based data transmission software
- Completeness check for data transmission

Cloud-based data hosting
- Backup protection
- Unlimited data storage volume
- Full access to raw data
- csv / Excel download
- Full access to high-resolution data
- Flexible configuration of vessel data
- Data Quality API

Daily check
Routine work is performed every regular working day by a service technician.
- Daily system health check
- Fleet data quality timeline
- Total fleet data quality timeline
- Data check for validity and threshold exceedances
- Fleet data summary

Profound validation and troubleshooting
The validation chapter of the Data Analysis Catalog is carried out quarterly.
- Measurement devices: profound validation of the measurement devices
- Downtime analysis for systems and signals
- Primary signals correlation matrix: correlation and time-series evaluation of ISO 19030 primary signals to detect specific behavior, sensor errors and offsets
- Troubleshooting analysis: one individual troubleshooting per vessel and year according to vessels under contract; use cases include data gaps, speed claims, groundings and bunker claims

Service level agreements
- Basic (Tier I)
- Advanced (Tier II)

Upcoming features
- Data Analyzer: intuitive tooling to contextualize data and make sense of it; used by internal users for troubleshooting
- Operational Event and Notification Service
How is the daily data inspection handled?
Daily evaluation of data quality is performed by a team consisting of marine engineers, naval architects and experienced service technicians. At least once per working day, the team visualizes and traces data losses, implausibilities and quality deterioration at the highest aggregation level. Technical clarification and troubleshooting is triggered both by sudden events and by gradual changes. Cases that can be solved internally by analyzing the data with the Hoppe Marine Signal Inspector or the Data Analysis Catalog, or by remote access, as well as communication with clients or third-party providers for interface troubleshooting, are part of the daily business.
What happens when a faulty sensor, an invalid signal or a failed interface is detected?
If a signal from a normally intact sensor transmits invalid data, a check for implausible behaviour due to unfavorable environmental or operating conditions is performed to determine whether the sensor is actually malfunctioning. For example, an alleged offset of a flow meter during a ship's standstill can have various causes.
In case conspicuous sensors are detected, further agreements with clients are possible to directly offer corresponding spare parts and service jobs.
How can the data quality be calculated?
Data quality is based on the quality of every single signal. A distinction is made between primary signals (data essential for reporting and for optimization of operation) and secondary signals (further ship operation data for monitoring, planned maintenance, etc.) to output aggregated quality. The quality of each signal is calculated taking invalid and implausible data into account (limit value exceedances, application of algorithms, etc.).
Furthermore, the Data Analysis Catalog produces a correlation matrix, including the evaluation of different engineering processes and their relationships on board. The correlation has no direct influence on data quality, since in this case a technician performs an evaluation to determine the exact faulty input variable.
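As an illustration only (not Hoppe's actual algorithm), per-signal quality can be sketched as the fraction of samples that are present and inside plausible limits, aggregated separately for primary and secondary signals. All signal names and limits below are invented:

```python
# Hypothetical sketch of per-signal quality scoring; the real validation
# also covers drift detection, correlation checks, etc.

def signal_quality(samples, lo, hi):
    """Fraction of samples that are present and within plausible limits."""
    if not samples:
        return 0.0
    ok = sum(1 for s in samples if s is not None and lo <= s <= hi)
    return ok / len(samples)

def aggregate_quality(signals):
    """signals: {name: (samples, lo, hi, is_primary)} -> mean quality per group."""
    groups = {"primary": [], "secondary": []}
    for samples, lo, hi, is_primary in signals.values():
        groups["primary" if is_primary else "secondary"].append(
            signal_quality(samples, lo, hi))
    return {g: (sum(q) / len(q) if q else None) for g, q in groups.items()}

quality = aggregate_quality({
    # invented example signals: one data gap (None) and one spike (250.0)
    "me_fuel_flow": ([12.1, 12.3, None, 250.0], 0.0, 100.0, True),
    "stw":          ([14.2, 14.1, 14.0, 13.9], 0.0, 30.0, True),
    "er_temp":      ([41.0, 41.5, 42.0, 41.8], -10.0, 80.0, False),
})
```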
Can the data quality be monitored by other sensors?
Yes, all sensors and values included in the system can be monitored. The detection of failed interfaces and malfunctioning sensors, as well as checks for exceedance of operationally unreachable limit values, are part of the scope of performance. In-depth validation algorithms that assume knowledge about sensor behavior can normally only be applied to sensors made by Hoppe Marine.
Are individual evaluations and reports possible?
The package “Operational Analyst” includes complete or customized analysis catalogs that offer in-depth information about the vessel operation. The fact sheet for the Data Analysis Catalog can be found on the hoppe-marine.com website.
Can the chosen data package be changed after the initial setup of interfaces, e.g. between Basic Data and Quality Data?
Yes, the subscription model can be changed and optimized later according to the client’s needs. Besides this, get in contact with us to learn about our SDK.
Is the provision of historical data possible?
The APIs provide all data files available for the specific vessel. For example, if data recording with the embedded iPC HOMIP2 on board started two years before the contract was signed, this data can be made available on the shore-side API upon customer request. Furthermore, see “Is it possible to load external or historic data into the data pool via interface?” in the Ship-to-Shore FAQ.
Is the data also retrievable by Fleet Management systems / Fleet Optimization providers?
In principle, every IT system that communicates with REST APIs is capable of retrieving data from the Hoppe Data-Pool. Interfaces to business intelligence tools such as Tableau, but also to Elasticsearch or classic ISO-SQL-compliant database systems via JDBC, can therefore be provided. Please contact our Hoppe Marine system administrators for technical details.
How can the information about data quality be retrieved?
Hoppe Marine offers a simple interface for retrieving information about data quality; it is described in the developer documentation. Data quality information for every single data point as well as for entire signal groups can be retrieved via this interface.
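As a minimal sketch, such a retrieval could look as follows in Python. The endpoint path `/quality/vessels/{id}` and the vessel identifier are assumptions for illustration; the real paths and response shapes are in the developer documentation. Only the base URL and the x-api-key header are taken from this FAQ:

```python
# Sketch: building an authenticated GET request against the Hoppe REST
# interface with the standard library. The '/quality/vessels/{id}' path
# is hypothetical; consult docs.hoppe-sts.com for the actual endpoints.
import urllib.request

API_BASE = "https://api.hoppe-sts.com"  # base URL from the curl example in this FAQ

def quality_request(vessel_id: str, api_key: str) -> urllib.request.Request:
    """Build a GET request for a vessel's data-quality information."""
    return urllib.request.Request(
        f"{API_BASE}/quality/vessels/{vessel_id}",
        headers={"accept": "application/json", "x-api-key": api_key},
        method="GET",
    )

req = quality_request("IMO9000001", "my-secret-key")
# urllib.request.urlopen(req) would perform the call; it is omitted here
# so the sketch runs without network access or a real API key.
```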
Is it possible to export data for further individual work with Excel?
Offering raw data in various formats (e.g. .csv and .xls) is planned as a new feature for 2020. Today, raw data is offered as optimized SQLite files or in JSON format.
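Until a native export exists, an SQLite file can be converted to CSV for Excel with a few lines of standard-library Python. The table and column names below are invented for the demo; the actual schema of the delivered SQLite files is described in the developer documentation:

```python
# Sketch: dumping one table of an SQLite raw-data file to CSV.
# Schema ('signals' table with 'ts' and 'stw' columns) is invented.
import csv
import sqlite3

def sqlite_to_csv(db_path: str, table: str, csv_path: str) -> int:
    """Dump one table to CSV; returns the number of data rows written.
    (Table name is interpolated directly - fine for a local sketch only.)"""
    with sqlite3.connect(db_path) as con:
        cur = con.execute(f"SELECT * FROM {table}")
        header = [d[0] for d in cur.description]
        rows = cur.fetchall()
    with open(csv_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(header)
        writer.writerows(rows)
    return len(rows)

# Build a tiny demo database so the sketch is runnable end to end:
with sqlite3.connect("demo.sqlite") as con:
    con.execute("CREATE TABLE IF NOT EXISTS signals (ts TEXT, stw REAL)")
    con.execute("DELETE FROM signals")
    con.executemany("INSERT INTO signals VALUES (?, ?)",
                    [("2020-01-01T00:00Z", 14.2), ("2020-01-01T00:01Z", 14.1)])

n = sqlite_to_csv("demo.sqlite", "signals", "signals.csv")
```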
Where can I get information about the APIs?
Hoppe Marine offers a developer portal for clients, reachable worldwide 24/7 via docs.hoppe-sts.com. After successful registration, the client’s developers have access to all necessary API details and functionalities. Prior to first use, a user account must be set up by our Hoppe Marine administrators.
How often is new data available via the API?
This behaviour is fully customer-configurable. Data update rates range from every two minutes to once per day. This is always a trade-off between the used data volume and real-time data requirements. The configuration can be made during the early project planning phase for the initial installation, and changes remain possible later on. A typical use case with sufficiently high resolution for data evaluation and fleet optimization is an export logging rate of 1 min with an export and transmission interval of 5 min.
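A back-of-envelope calculation illustrates the trade-off. The 300-signal figure comes from the storage FAQ below; the ~8 bytes per raw sample is an assumption for illustration, and actual payloads are aggregated and compressed before transmission:

```python
# Rough, uncompressed volume estimate for different logging rates.
# 300 signals is taken from this FAQ; 8 bytes/sample is an assumption.
SIGNALS = 300
BYTES_PER_SAMPLE = 8  # assumed raw size per logged value

def daily_volume_mb(logging_rate_min: float) -> float:
    """Approximate uncompressed MB/day for a given logging interval."""
    samples_per_day = (24 * 60 / logging_rate_min) * SIGNALS
    return samples_per_day * BYTES_PER_SAMPLE / 1e6

per_min = daily_volume_mb(1)        # 1-min logging: ~3.5 MB/day
per_day = daily_volume_mb(24 * 60)  # one sample per day: a few kB
```

Even at 1-min resolution the uncompressed payload stays in the low-megabytes-per-day range under these assumptions, which is why the 1 min / 5 min configuration is workable over satellite links.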
Is it possible to have access to a sample data set in order to replicate the live API before the vessels are online?
A representative data sample will be provided upon request. The data set contains one month of representative simulated ship data. The available set of signals (database columns) as well as the data aggregation for a specific vessel might differ from the sample data set. All these configurations (data aggregation rate, data export rate and the like) are at the customer’s disposal and can be adjusted to customer needs.
Can I get some assistance by Hoppe Marine in terms of setup and configuration, in case I have no experience in retrieving data via APIs?
Hoppe Marine is delighted to assist during the initial setup. For this purpose, an initial consultation meeting is held to discuss all requirements and to provide possible short-term solutions for optimized utilization of the Data-Pool services. Afterwards, we can decide how Hoppe Marine can further assist the client. In any case, full access to the system and interface documentation is granted after the initial consultation meeting.
Will the signals API be versioned? In other words, is the current API version "1" and will any changes to the signals structure and mappings be named and released separately?
Any backwards-compatibility-breaking changes will be announced and versioned via path versioning. For the moment, our development roadmap contains only schema additions, which we consider non-breaking minor releases; these will not be reflected in the version scheme.
We have tried to gain access to the signals API but we are still getting the response: "403 Forbidden" Is this expected? It would be very helpful for us to have the full signal schema so that we can integrate generally with the API.
Could you please check again that you have subscribed to the Signals-API in the developer portal and that you provide your personal API key (obtainable via the dashboard at https://docs.hoppe-sts.com/) in the x-api-key header field of your GET request? Please also check that you can retrieve data via the “Try it out” functionality in the developer portal. A sample request in cURL would look like:
curl -X GET "https://api.hoppe-sts.com/signals/collections/hoppe/signals" -H "accept: application/json" -H "Authorization: <token>" -H "x-api-key: <your-api-key>"
We have been receiving a 429 Too Many Requests response code when sending multiple requests simultaneously across several vessels. In the short term, we can limit the requests to only registered vessels that are confirmed to have data. However, as more vessels are brought onto our platform, we may run into these limits in a more significant way, especially as we will continue to utilize a parallelized workflow. What is the API specification on the rate limiting threshold and retry periods?
Yes, the APIs are rate-limited by default. This is to encourage users to use local caching for slowly changing data. However, the limits are set per API key and can always be adjusted to the customer’s needs. In order to adjust the limits, please provide us with numbers describing your expected usage pattern:
- regular requests/second
- max. burst requests/second
- requests per month.
With these numbers at hand we can adjust your account settings accordingly.
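Until the limits are raised, a client can absorb 429 responses with exponential backoff. This is a generic sketch, not part of any Hoppe SDK; `send` stands for whatever callable performs the actual GET and returns the HTTP status code:

```python
# Sketch: client-side retry with exponential backoff on HTTP 429.
# 'send' is any callable returning a status code; 'sleep' is injectable
# so the demo (and tests) can run without real waiting.
import time

def request_with_backoff(send, max_retries=5, base_delay=1.0, sleep=time.sleep):
    """Retry 'send' on 429, doubling the delay each attempt (capped at 60 s)."""
    for attempt in range(max_retries + 1):
        status = send()
        if status != 429:
            return status
        if attempt < max_retries:
            sleep(min(base_delay * 2 ** attempt, 60.0))
    return 429

# Demo: the first two calls are throttled, the third succeeds.
responses = iter([429, 429, 200])
delays = []
status = request_with_backoff(lambda: next(responses), sleep=delays.append)
```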
Which products are utilised and how do you manage your cloud security?
Amazon S3 and Amazon DynamoDB are used for storage. Data in S3 is encrypted at rest using the SSE-KMS method; the underlying block cipher is AES-256. Data in DynamoDB is likewise encrypted at rest.
Regarding cloud security management, there are four major components:
- For user management of our APIs, a fine-grained role-based access control model using Amazon Cognito with Amazon API Gateway handles authentication and authorization.
- Inside AWS, all our cloud resources use AWS Identity and Access Management (IAM) to implement least-privilege access.
- Hoppe uses security groups where possible to restrict access as well.
- Hoppe uses AWS Secrets Manager for handling secrets internally.
What data storage volume can we expect on the vessel (is the data stored on the vessel or in the cloud)?
All raw data is kept on the ship as the primary data source and is stored safely on the HOMIP2 device’s flash memory. With a 64 GB flash drive, one can expect a log retention of at least 2 years in a standard performance monitoring solution (300 signals logged every minute on average). Before transmitting data to shore, we pre-aggregate it to optimize the transferred data volume. The exported data therefore usually has a lower time resolution than what is available on the ship, though this is customer-configurable. From this point of view, the cloud storage is in most cases a down-sampled mirror of the on-board database.
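The retention figure is easy to sanity-check. The 300 signals, 1-min logging and 64 GB values come from the answer above; the ~100 bytes of on-disk footprint per logged value (value plus timestamp, index and filesystem overhead) is an assumption:

```python
# Back-of-envelope check of the ">= 2 years on 64 GB" retention claim.
# 100 bytes/sample of on-disk footprint is an assumed figure.
SIGNALS = 300
BYTES_PER_SAMPLE = 100   # assumed: value + timestamp + index + FS overhead
FLASH_BYTES = 64e9

samples_per_year = SIGNALS * 60 * 24 * 365          # one sample per minute
retention_years = FLASH_BYTES / (samples_per_year * BYTES_PER_SAMPLE)
# With these assumptions the drive holds roughly 4 years of logs,
# comfortably above the stated 2-year minimum.
```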





