
Superior strategies are based on superior data

We provide shipping companies with high-quality data for their requirements in the areas of SAFETY, COMPLIANCE, PLANNING and OPTIMIZATION.

Comprehensive fleet analyses and optimisations are only possible with comprehensive digitalisation of your ship systems. Hoppe collects your data and provides it to you, fully validated if desired, anytime and anywhere.

Information Services

Fleet Connect

Validating a large number of the ship's operational and nautical measurement values onshore for measurement optimization – this is the key demand of today's maritime data acquisition.

The challenge for vessel owners and operators of verifying and validating the large amount of data they receive on shore keeps growing. Unclean data can significantly lower the optimization potential and may even put vessel operation at a disadvantage.

The Data Supplier

Data Butler

Provision of excellent ship data

The DATA BUTLER is a cloud-based service that provides ship and fleet data to the customer via standard interfaces for further processing. Different service levels are available, from high-resolution data provision on demand up to full data quality information.

Key Features

  • Provision of ship data with easy access
  • Full system responsibility
  • Unlimited cloud-based data storage – with 24/7 access
  • Focus on cyber security – encrypted ship-to-shore data transfer
  • Remote service, updates and customer-specific configuration

The Data Verifier

Data Inspector

Ensure Data Quality

The DATA INSPECTOR is an additional service to the DATA BUTLER and is also available as a stand-alone service. It comprises daily data checks, fleet data quality reports and troubleshooting measures, and thus provides the data basis for optimized fleet planning and vessel operation.

Key Features

  • Ensured data quality as a precondition for compliance, planning and optimization
  • Continuous system health check and daily data evaluation by qualified service technicians, analysts and marine engineers
  • Fleet data transparency through automatic reports, including system health and quality KPIs
  • In-depth evaluation of primary signals to detect implausibilities, drifts and offsets
  • Fast response – (remote) service attendance reduces downtime to a minimum

Choose your perfect fit – our data packages

Packages: DATA BUTLER · DATA INSPECTOR
Tiers: Transparency · Quality · Maintenance · Analyzer

On board
Ship data server application incl. configuration
Web-based data transmission software
  • Configurable bandwidth limit
  • Bandwidth detection prevents transmission if a weak connection is detected
Completeness check for data transmission
  • SatCom hardware and airtime not included.
Cloud based data hosting
Backup protection
  • Data hosting in EU server centres with 6 backup copies
Unlimited data storage volume
Full access to raw data
csv / Excel Download
  • Freely selectable time range for ad-hoc analysis or troubleshooting.
Full access to high-resolution data
  • High-resolution historic data available on demand.
Flexible configuration of vessel data
  • Free remote update of data configuration, logging and export configuration
Data Quality API
  • Quality Data via API with quality information for single variables
DAILY CHECK

Routine work is performed every regular working day by a service technician.

Daily System Health Check
  • Check that monitored Hoppe systems are online
  • Check that the logging device is active
  • Completeness check for data transmission
Fleet Data quality timeline
Total Fleet Data quality timeline
Data check for validity and threshold exceedances
  • The availability and validity of all signals according to the agreed signal list are checked.
  • All signals are checked against physical thresholds as a plausibility check.
Fleet Data Summary
PROFOUND VALIDATION AND TROUBLESHOOTING

The validation chapter of the Data Analysis Catalog is carried out quarterly.

Measurement devices

Profound validation of the following devices is carried out:

  • Shaft power meter,
  • All flow meters,
  • Speed log,
  • Draught sensors.
Downtime Analysis

Downtime analysis for systems and signals.

Primary Signals Correlation Matrix

Correlation and time series evaluation of ISO19030 primary signals to detect specific behavior, sensor errors and offsets.

Troubleshooting Analysis

One individual troubleshooting analysis per vessel per year, according to the vessels under contract. Typical use cases for an evaluation are data gaps, speed claims, groundings and bunker claims.

  • The user can choose from more than 100 plots and visualizations.
  • The user specific selection of the Analysis Catalog will be evaluated.
Service Level Agreements
Basic (Tier I)
  • 48 h e-mail response / notification
  • 1 h Service Inhouse (SIH) per vessel and month for general troubleshooting included (e.g. 20 vessels = 20 h included)
Advanced (Tier II)
  • Technical service hotline – working days 9 am – 5 pm
  • 24 h e-mail response / notification
  • Remote maintenance
  • Billing via SIH for issues not caused by Hoppe and not covered by warranty
UPCOMING FEATURES
Data Analyzer

Intuitive tooling to contextualize data and make sense of it; currently used by internal users for troubleshooting.

Operational Event and Notification Service
  • Users can create rule-based notifications based on health, quality, events, time series data processing and threshold exceedances
  • Notifications are highly customizable (see the sketch below)
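
As a minimal sketch of what such a rule could look like (the rule format, signal name and threshold below are invented for illustration; the actual service configuration may differ):

    # Hypothetical rule: notify when shaft power exceeds a threshold.
    rule = {
        "signal": "shaft_power_kw",   # invented signal name
        "operator": "gt",
        "threshold": 20000.0,
    }

    def check(sample: dict, rule: dict) -> bool:
        # Return True when the sample violates the rule and a
        # notification should be emitted.
        value = sample.get(rule["signal"])
        return value is not None and rule["operator"] == "gt" and value > rule["threshold"]

    if check({"shaft_power_kw": 21500.0}, rule):
        print("notification: threshold exceedance detected")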

Frequently Asked Questions

Which requirements must be met by the vessel’s IT infrastructure?

The Hoppe Ship-to-Shore connection does not include its own VSAT link; the client must therefore ensure that a VSAT connection is available. For communication with the shore-based servers, outgoing TCP traffic from the HOMIP2 to the following IP addresses and port must be enabled in the vessel’s firewall.

Primary Address – IP: 75.2.111.192 – Port: 11550
Fallback Address – IP: 99.83.166.216 – Port: 11550

A detailed checklist will be provided upon request.
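
To verify from the vessel’s network that these endpoints are reachable, a quick check along the following lines can help (a plain Python sketch, not part of the Hoppe toolchain; addresses and port are taken from the list above):

    import socket

    # Outgoing TCP endpoints from the firewall list above.
    ENDPOINTS = [
        ("75.2.111.192", 11550),   # primary
        ("99.83.166.216", 11550),  # fallback
    ]

    for host, port in ENDPOINTS:
        try:
            # A short timeout is enough to detect a blocked port.
            with socket.create_connection((host, port), timeout=5):
                print(f"{host}:{port} reachable")
        except OSError as exc:
            print(f"{host}:{port} blocked or unreachable: {exc}")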

Is it possible to load external or historic data into the data pool via interface?

For data storage, Hoppe Marine utilizes a “Data Lake” structure: structured data can be loaded into the data pool from a variety of sources. Structured data here means e.g. log data, telemetry data etc. For data analysis, time series data in various formats are the most relevant. Currently, the following formats are accepted: CSV, JSON, SQLite and Parquet. Furthermore, it is important that the client defines the column name assignments, because for evaluation the data must be mapped to a universal naming standard, as sketched below.
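
For illustration, mapping a client’s CSV columns to an agreed naming standard before hand-over could look like this (column names are invented; pandas is used for the sketch):

    import pandas as pd

    # Hypothetical mapping from the client's column names to the agreed
    # universal naming standard (names are illustrative).
    COLUMN_MAP = {
        "ts": "timestamp",
        "me_pwr_kw": "main_engine_power",
        "stw_kn": "speed_through_water",
    }

    df = pd.read_csv("external_log.csv")   # CSV is one of the accepted formats
    df = df.rename(columns=COLUMN_MAP)
    df["timestamp"] = pd.to_datetime(df["timestamp"])

    # Parquet is also accepted and is compact for time series data.
    df.to_parquet("external_log.parquet", index=False)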

Can data and files of external partners be transmitted?

Data transmission is based on packages whose content is irrelevant. The only requirement is that all packages are cryptographically signed prior to transmission to give proper evidence of their source. For this purpose, Hoppe Marine offers a crypto software solution on HOMIP units that is capable of signing data from various sources; a REST API provides access to this feature.

Currently, this feature is available only for Hoppe-internal services. The release of the interface for data transmission of external partners is part of our release plan 2021.

However, the embedded iPC HOMIP supports several protocols for data collection. With the iDBS database solution, Hoppe Marine offers a feature that makes use of interfaces of other manufacturers to store the data and make it available on the shore side. This feature is available right after activation of the service.
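
Once the interface is released, a partner-side call could look roughly like the following sketch; the endpoint URL and payload field are invented here, since the actual REST API description is Hoppe-internal:

    import requests

    # Hypothetical endpoint on the HOMIP unit; not a published Hoppe API.
    SIGNING_ENDPOINT = "http://homip.local/api/sign"

    with open("partner_data.bin", "rb") as f:
        resp = requests.post(SIGNING_ENDPOINT, files={"payload": f}, timeout=30)
    resp.raise_for_status()

    # The signed package is then eligible for ship-to-shore transmission.
    signed_package = resp.content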

Is the Ship-to-Shore transmission secure?

[Image: Ship-to-Shore security concept]

The Ship-to-Shore transmission utilizes a multi-level safety concept. This concept distinguishes between Identity Protection, Access Protection and Integrity Protection.

Identity Protection

The first level of the safety concept ensures the trustworthiness of the communication partners. Every communication endpoint is fitted ex works with a private cryptographic key. This key never leaves the device and therefore cannot be compromised. Only with a correct key is a device able to transmit or receive data.

Access Protection

Once it is ensured that data comes from a trustworthy source, a secure SSL-encrypted connection between the communication partners is established in the next step. This encrypted connection prevents access by third parties; only the two communication partners are able to read the data in clear form.

Integrity Protection

After successful transmission, a further step ensures that the data is intact and corresponds to the data sent from the vessel. For this purpose, cryptographic signatures according to the industry standard RFC 7519 (JSON Web Token) are used.
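
As a sketch of such an RFC 7519 check (assuming, for illustration, that a payload arrives as a JWT signed with the vessel’s EC key and that the ES512 algorithm is used; PyJWT serves as the example library):

    import jwt  # PyJWT

    def verify_package(token: str, vessel_public_key: str) -> dict:
        # Raises jwt.InvalidSignatureError if the content was altered
        # in transit; returns the decoded payload otherwise.
        return jwt.decode(token, vessel_public_key, algorithms=["ES512"])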

Is the data protected inside the Hoppe data pool?

Our provider, responsible for data storage, is certified according to ISO/IEC 27001:2013. The certification guarantees the following core concepts of a data storage provider in terms of data protection and security:

  • We evaluate our IT security risks systematically, considering the effects of threats and weaknesses.
  • We design and follow a complete range of IT security checks and other forms of risk management to address all security risks of the company and architecture.
  • We maintain a complete management process to ensure that our IT security checks meet our IT security procedures at all times.

Access to the data is granted based on the so-called Least Privilege model: access to every resource must be actively granted by Hoppe administrators before a new user can access data. Data access can be defined in a very detailed and tailored way:

  • For every user, access to specific vessels can be determined in detail.
  • Furthermore, it can be determined per user whether they can view raw data or only have access to aggregated data.

Generally, data access is only possible via a few, highly monitored channels. This eases early detection of unauthorized access.

How is the ship data encrypted in transit from ship-to-shore? What transfer protocols are utilized?

During transport, the data is encrypted via TLS from the endpoint on the ship to the endpoint in the cloud solution, and during all transport inside the cloud’s private network. Additionally, the payload data is signed with a cryptographic key (EC-512) to eliminate any chance of data manipulation in transit. Hardware on the ship needs to prove its identity before a connection to shore can be established. This identity management is based on EC-512 certificates. Trust is established by exchanging the device’s public keys with the shore-side server during device production.

What about accessibility and backup security?

All data is protected against loss by six backup copies. Furthermore, data is stored in at least two different server centres within the European Union (EU) to assure data security and data integrity even in natural disasters, such as fire. In terms of constant, worldwide accessibility, the provider of the infrastructure guarantees an availability of 99.47%. In the worst case, this means that access to data may be unavailable for at most about two days per year. For data storage itself, it is assured at all times that data is stored and catalogued appropriately and that no data gaps occur.

Can data get lost?

For data storage, Hoppe Marine utilizes a model called WORMS (write once, read many). With WORMS, data that has been stored once in the data pool cannot be changed any more: it can neither be overwritten nor deleted. Furthermore, all data is protected against loss by six backup copies. In addition, all data is stored in at least two different server centres within the European Union (EU) to ensure data protection and data integrity even in natural disasters, such as fire.

Is it possible to transmit 1000 data points per minute from a vessel?

The chosen IT infrastructure is designed for good horizontal scaling: as data volume grows, the service resources are scaled up accordingly. A transmission of 1000 data points per minute is therefore no problem, as long as the satellite connection offers the required bandwidth.
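
A back-of-envelope estimate shows why this volume is unproblematic (the bytes-per-point figure is an assumption; the real size depends on encoding and compression):

    points_per_minute = 1000
    bytes_per_point = 50          # assumed payload per data point
    rate = points_per_minute * bytes_per_point / 60
    print(f"~{rate:.0f} B/s sustained")   # well below typical VSAT bandwidth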

Do you have the capability to remotely access your ship hardware? What protocol is used to do this?

Upload from the shore side to the ship is only possible in a specific package format. The corresponding API endpoints are secured by two measures: user authentication as well as IP-based whitelisting. No ingress connection can be initiated from the outside towards the ship; only the ship can establish a connection to two fixed unicast IP addresses, and no DNS resolution is involved in the process. Only approved service engineers can upload and install updates for the device on the ship and change connection parameters, but they never have direct remote access (e.g. via SSH or any remote shell) to it. Only cryptographically signed update packages are accepted by the devices on the ship. The cryptographic material for signing is managed in a HashiCorp Vault implementation.

How does the data transmission work?

The data transmission is file-based, and the interval for data exports is configurable. For transmission, a satellite connection is established and a direct, encrypted connection to fixed IP addresses without DNS is used. All files are collected as database files and transmitted together in blocks. Using a hash check, the entire file content is subjected to an integrity test. The transmission of encrypted data is subject to bandwidth regulation.
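
The integrity test can be pictured as follows (a sketch; the concrete hash method used by Hoppe is not specified here, SHA-256 stands in as an example):

    import hashlib

    def file_digest(path: str) -> str:
        # Hash the entire file content in chunks; ship and shore compute the
        # digest independently, and a mismatch triggers retransmission.
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()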

How big is the transmission overhead in addition to the transmitted amount of data?

Encryption, connection set-up, etc. have already been optimized. However, one should always consider the following: the smaller the single exported data blocks are (i.e. the shorter the export intervals), the bigger the relative overhead. This is similar to a letter that always costs 80 ct, no matter whether it contains one A4 page or three. Measurements of the relation between payload size and transmission overhead show that the relative overhead becomes smaller when larger file sizes are encrypted and transmitted.

What happens when the integrity test fails? Will the entire file content for which the integrity test failed be transmitted again?

Yes, but this only happens to files whose transmission cannot be continued. In the normal case of a connection failure, the transmission continues where it was interrupted. Only in case of a high number of packet losses is the transmission repeated completely. A bandwidth detection prevents transmission if a weak connection is detected, so the complete repetition of a transmission is only the last resort. The maximum number of transmission attempts can be set per file; thus, expensive, endless attempts are avoided. When a better connection quality is detected, the transmission can be restarted via a web interface directly on board.

What is the origin of the timestamp for time series data?

The time series data is logged by Hoppe Marine’s iDB server, which uses the master time on the HOMIP2 as the time source. This time source is configured on the HOMIP2 display and is independent of the date/time of the operating system. If the customer has a GPS receiver on board, the HOMIP2 can also use the GPS time signals as the time source. GPS time signals are preferred, since they provide a well-established time standard and minimize the chance of entering the date/time incorrectly.

Why is time correctness essential for meaningful, high-quality data?

Our goal is to log and provide ship operation data accurately and consistently. This makes it possible to identify in retrospect the exact date and time at which an event occurred, or what a signal value was at that moment. This cannot be done if the time source is unreliable. As a worst-case example, if the clock is set back, duplicate data can occur for the same time range; if this happens, no meaningful report can be generated.
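
The duplicate-data case can be made concrete with a small sketch (data and column names are invented):

    import pandas as pd

    df = pd.DataFrame({
        "timestamp": pd.to_datetime([
            "2021-03-01 10:00", "2021-03-01 10:01",
            "2021-03-01 10:00",   # clock set back -> duplicate time range
        ]),
        "speed_through_water": [12.1, 12.3, 11.8],
    })

    duplicates = df[df.duplicated("timestamp", keep=False)]
    if not duplicates.empty:
        print("Ambiguous samples; no meaningful report possible:")
        print(duplicates)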

How is the daily data inspection handled?

Daily evaluation of data quality is performed by a team consisting of marine engineers, naval architects and experienced service technicians. At least once per working day, the team visualizes and traces data losses, implausibilities and quality deterioration at the highest aggregation level. Based on sudden events as well as gradual changes, technical clarification and troubleshooting is carried out. Cases that can be solved internally by analyzing the data with the Hoppe Marine Signal Inspector, the Data Analysis Catalog or by remote access, as well as communication with clients or third-party providers for interface troubleshooting, are part of the daily business.


What happens when a faulty sensor, an invalid signal or a failed interface is detected?

If a signal from a normally intact sensor transmits invalid data, a check for implausible behaviour due to unfavourable environmental or operating conditions is performed to determine whether the sensor is actually malfunctioning. For example, an alleged offset of a flow meter during a ship’s standstill can have various causes.

If conspicuous sensors are detected, further agreements with clients are possible in order to directly offer corresponding spare parts and service jobs.

How is the data quality calculated?

The data quality is based on the quality of every single signal. A distinction is made between Primary Signals (data essential for reporting and optimization of operation) and Secondary Signals (further ship operation data for monitoring, planned maintenance, etc.) to output aggregated quality. The quality for each signal is calculated taking into account invalid and implausible data (limit value exceedances etc., application of algorithms).
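
A simplified illustration of such a per-signal quality figure (thresholds and values are invented; the actual algorithms are more involved):

    import numpy as np

    def signal_quality(values: np.ndarray, lo: float, hi: float) -> float:
        # Share of samples that are both valid and physically plausible.
        valid = ~np.isnan(values)                    # invalid data
        plausible = (values >= lo) & (values <= hi)  # limit value check
        return float(np.mean(valid & plausible))

    shaft_power_kw = np.array([8200.0, 8150.0, np.nan, 84000.0, 8100.0])
    print(f"quality: {signal_quality(shaft_power_kw, 0.0, 25000.0):.0%}")  # 60%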

Furthermore, with the Data Analysis Catalog a correlation matrix is created, including the evaluation of different engineering processes and their relationships on board. The correlation has no direct influence on data quality, since in this case a technician performs an evaluation to determine the exact faulty input variable.

Can the data quality be monitored by other sensors?

Yes, all system-included sensors and values can be monitored. The detection of failed interfaces and malfunctioning sensors, as well as checking for exceedances of operationally unreachable limit values, are part of the scope of performance. In-depth validation algorithms that assume knowledge about sensor behavior can normally only be applied to sensors made by Hoppe Marine.

Are individual evaluations and reports possible?

The package “Operational Analyst” includes complete or customized analysis catalogs that offer in-depth information about the vessel operation. The fact sheet for the Data Analysis Catalog can be found on the hoppe-marine.com website.

Can the chosen data package be changed after the initial setup of interfaces, e.g. from Basic Data to Quality Data?

Yes, the subscription model can be changed and optimized later according to the client’s needs. Besides this, get in touch to learn about our SDK.

Is the provision of historical data possible?

The APIs provide all data files that are available for the specific vessel. For example, if data recording with the embedded iPC HOMIP2 on board started two years before the contract was signed, this data can be made available on the shore-side API upon customer request. Furthermore, see “Is it possible to load external or historic data into the data pool via interface?” in the Ship-to-Shore FAQ.

Is the data also retrievable by Fleet Management systems / Fleet Optimization providers?

In principle, every IT system that communicates with REST APIs is capable of retrieving data from the Hoppe Data Pool. Interfaces to Business Intelligence tools such as Tableau, but also to Elasticsearch or classic ISO-SQL-compliant database systems via JDBC, can therefore be provided. Please contact our Hoppe Marine system administrators for technical details.

How can the information about data quality be retrieved?

Hoppe Marine offers a simple interface for retrieving information about data quality; it is described in the developer documentation. Information about the data quality of every single data point as well as of entire signal groups can be retrieved via this interface.
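
A minimal retrieval sketch (the path and vessel identifier are placeholders; the authoritative endpoint description lives in the developer documentation at docs.hoppe-sts.com):

    import requests

    BASE = "https://api.hoppe-sts.com"

    resp = requests.get(
        f"{BASE}/quality/vessels/<vessel-id>/signals",   # hypothetical path
        headers={"accept": "application/json", "x-api-key": "<your-api-key>"},
        timeout=30,
    )
    resp.raise_for_status()
    print(resp.json())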

Is it possible to export data for further individual work with Excel?

Offering raw data in various formats (e.g. .csv and .xls) is planned as a new feature for 2020. Today, raw data is offered as optimized SQLite data or in JSON format.

Where can I get information about the APIs?

Hoppe Marine offers a developer portal for clients, reachable worldwide 24/7 via docs.hoppe-sts.com. After successful registration, the client’s developers have access to all necessary API details and functionality. Prior to first use, a user account must be set up by our Hoppe Marine administrators.

How often will new data be available via the API?

This behaviour is fully customer-configurable. Data update rates range from every two minutes to once per day; this is always a trade-off between used data volume and real-time data requirements. The configuration is made during the early project planning phase for the initial installation, and changes are possible later on. A typical configuration with sufficiently high resolution for data evaluation and fleet optimization is a logging rate of 1 minute and an export and transmission interval of 5 minutes.

Is it possible to have access to a sample data set, in order to replicate the live API before the vessel is online?

A representative data sample will be provided upon request. This data set consists of representative simulated ship data covering one month. The available set of signals (database columns) as well as the data aggregation for a specific vessel might differ from the sample data set. All these configurations (data aggregation rate, data export rate and the like) are at the customer’s disposal and can be adjusted to the customer’s needs.

Can I get assistance from Hoppe Marine with setup and configuration if I have no experience in retrieving data via APIs?

Hoppe Marine is delighted to assist during the initial setup. For this purpose, an initial consultation meeting is held to discuss all requirements and to provide possible short-term solutions for an optimized utilization of the Data Pool services. After that, we can decide how Hoppe Marine can further assist the client. In any case, after the initial consultation meeting, full access to the system and interface documentation is granted.

Will the signals API be versioned? In other words, is the current API version "1" and will any changes to the signals structure and mappings be named and released separately?

Any backwards-compatibility-breaking changes will be announced and versioned via path versioning. At the moment, our development roadmap only contains schema additions, which we consider non-breaking minor releases; these will not be reflected in the version scheme.

We have tried to gain access to the signals API but we are still getting the response “403 Forbidden”. Is this expected? It would be very helpful for us to have the full signal schema so that we can integrate generally with the API.

Please check again that you subscribed to the Signals API in the developer portal and that you provide your personal API key (to be obtained at the https://docs.hoppe-sts.com/ dashboard) via the x-api-key header field in your GET request. Please also check that you can retrieve data via the “Try it out” functionality in the developer portal. A sample request in curl would look like:

    curl -X GET "https://api.hoppe-sts.com/signals/collections/hoppe/signals" -H "accept: application/json" -H "Authorization: <token>" -H "x-api-key: <your-api-key>"

We have been receiving a 429 Too Many Requests response code when sending multiple requests simultaneously across several vessels. In the short term, we can limit the requests to only registered vessels that are confirmed to have data. However, as more vessels are brought onto our platform, we may run into these limits in a more significant way, especially as we will continue to utilize a parallelized workflow. What is the API specification on the rate limiting threshold and retry periods?

Yes, the APIs are rate-limited by default. This is to encourage users to use local caching for slowly changing data. However, the limits are set per API key and can always be adjusted to the customer’s needs. In order to adjust the limits, we ask you to provide us with numbers about your expected usage pattern. The information we need is:

  • regular requests/second
  • max. burst requests/second
  • requests per month.

With these numbers at hand we can adjust your account settings accordingly.
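
Until the limits are adjusted, a client-side backoff pattern along these lines (an illustrative sketch using the requests library) keeps a parallelized workflow within the limits:

    import time
    import requests

    def get_with_backoff(url: str, headers: dict, max_attempts: int = 5):
        # Retry on 429, honouring Retry-After when present and backing off
        # exponentially otherwise.
        for attempt in range(max_attempts):
            resp = requests.get(url, headers=headers, timeout=30)
            if resp.status_code != 429:
                resp.raise_for_status()
                return resp
            wait = float(resp.headers.get("Retry-After", 2 ** attempt))
            time.sleep(wait)
        raise RuntimeError("rate limit still exceeded after retries")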

Which products are utilised and how do you manage your cloud security?

Amazon S3 and Amazon DynamoDB are used for storage. Data in S3 is encrypted at rest using the SSE-KMS method; the underlying block cipher is AES-256. DynamoDB data is likewise encrypted at rest. Regarding cloud security management, there are four major components:

  1. From a user management perspective for our APIs, a fine-grained Role-Based Access Control model
    that uses Amazon Cognito with Amazon API Gateway for authentication and authorization is used.
  2. Inside AWS, all our cloud resources use AWS Identity and Access Management (IAM) for
    implementing least privilege access.
  3. Hoppe uses security groups where possible to restrict access as well.
  4. Hoppe uses AWS Secrets Manager for handling secrets internally.
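
For illustration, requesting SSE-KMS server-side encryption when writing an object to S3 with boto3 looks like this (bucket and key names are placeholders; Hoppe’s actual storage setup is managed internally):

    import boto3

    s3 = boto3.client("s3")
    s3.put_object(
        Bucket="example-bucket",                   # placeholder
        Key="vessel-data/2021/03/export.parquet",  # placeholder
        Body=b"...",
        ServerSideEncryption="aws:kms",            # SSE-KMS as described above
    )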

What data storage volume can we expect on the vessel (and is the data stored on the vessel or in the cloud)?

All raw data is kept on the ship as the primary data source and is stored safely on the HOMIP2 device’s flash memory. With a 64 GB flash drive, one can expect a log retention of at least 2 years in a standard performance monitoring solution (300 signals logged every minute on average). Before transmitting the data to shore, we pre-aggregate it in order to optimize the transmitted data volume. The exported data therefore usually has a lower time resolution than what is available on the ship, though this is customer-configurable. From this point of view, the cloud storage is in most cases a down-sampled mirror of the database on board.
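
A back-of-envelope check of these retention figures (the per-sample size is an assumption; the true on-disk size depends on encoding and compression):

    signals = 300
    samples_per_day = signals * 60 * 24    # logged every minute
    bytes_per_sample = 200                 # assumption, incl. storage overhead
    flash_bytes = 64 * 10**9               # 64 GB flash drive

    days = flash_bytes / (samples_per_day * bytes_per_sample)
    print(f"~{days / 365:.1f} years of retention")   # roughly 2 years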


General Request

We’ll be happy to help you out with your project. Just drop us a line.
