camel-spark-rest-kafka-connector source configuration

When using camel-spark-rest-kafka-connector as a source, make sure to use the following Maven dependency to have support for the connector:

<dependency>
  <groupId>org.apache.camel.kafkaconnector</groupId>
  <artifactId>camel-spark-rest-kafka-connector</artifactId>
  <version>x.x.x</version>
  <!-- use the same version as your Camel Kafka connector version -->
</dependency>
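Once the connector archive is on the Kafka Connect plugin path, the source is configured through ordinary Kafka Connect properties. The following is a minimal sketch only: the connector class name follows the usual org.apache.camel.kafkaconnector naming convention and the topic name is a hypothetical placeholder; it sets just the two HIGH-priority path options listed below.

# Hypothetical Kafka Connect properties sketch (class name and topic are assumptions)
name=spark-rest-source
connector.class=org.apache.camel.kafkaconnector.sparkrest.CamelSparkrestSourceConnector
tasks.max=1
# Kafka topic that incoming HTTP requests are published to (placeholder name)
topics=spark-rest-topic
# Required path options (HIGH priority)
camel.source.path.verb=get
camel.source.path.path=hello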

The camel-spark-rest source connector supports 27 options, which are listed below.

camel.source.path.verb
  Description: The HTTP verb. One of: get, post, put, patch, delete, head, trace, connect, options.
  Default: null
  Priority: HIGH

camel.source.path.path
  Description: The content path, which supports Spark syntax.
  Default: null
  Priority: HIGH

camel.source.endpoint.accept
  Description: The Accept type, such as 'text/xml' or 'application/json'. By default all types are accepted.
  Default: null
  Priority: MEDIUM

camel.source.endpoint.bridgeErrorHandler
  Description: Allows for bridging the consumer to the Camel routing Error Handler, which means any exceptions that occur while the consumer is trying to pick up incoming messages (or the like) are processed as a message and handled by the routing Error Handler. By default the consumer uses the org.apache.camel.spi.ExceptionHandler to deal with exceptions; they are logged at WARN or ERROR level and ignored.
  Default: false
  Priority: MEDIUM

camel.source.endpoint.disableStreamCache
  Description: Determines whether the raw input stream from Spark HttpRequest#getContent() is cached (Camel reads the stream into a lightweight in-memory stream cache). By default Camel caches the Netty input stream so it can be read multiple times, ensuring Camel can retrieve all data from the stream. You can set this option to true when you need to access the raw stream, for example to stream it directly to a file or other persistent store. Note that if you enable this option you cannot read the Netty stream multiple times out of the box, and you would need to manually reset the reader index on the Spark raw stream.
  Default: false
  Priority: MEDIUM

camel.source.endpoint.mapHeaders
  Description: If this option is enabled, then during binding from Spark to the Camel Message the headers are mapped as well (that is, added as headers on the Camel Message). You can turn this option off to disable the mapping. The headers can still be accessed from the org.apache.camel.component.sparkrest.SparkMessage message via the getRequest() method, which returns the Spark HTTP request instance.
  Default: true
  Priority: MEDIUM

camel.source.endpoint.transferException
  Description: If enabled, and an Exchange failed processing on the consumer side, the caused Exception is sent back serialized in the response as an application/x-java-serialized-object content type. This is turned off by default. If you enable this, be aware that Java will deserialize the incoming data from the request, which can be a potential security risk.
  Default: false
  Priority: MEDIUM

camel.source.endpoint.urlDecodeHeaders
  Description: If this option is enabled, then during binding from Spark to the Camel Message the header values are URL decoded (e.g. %20 becomes a space character).
  Default: false
  Priority: MEDIUM

camel.source.endpoint.exceptionHandler
  Description: To let the consumer use a custom ExceptionHandler. Note that if the bridgeErrorHandler option is enabled, this option is not in use. By default the consumer deals with exceptions; they are logged at WARN or ERROR level and ignored.
  Default: null
  Priority: MEDIUM

camel.source.endpoint.exchangePattern
  Description: Sets the exchange pattern when the consumer creates an exchange. One of: InOnly, InOut, InOptionalOut.
  Default: null
  Priority: MEDIUM

camel.source.endpoint.basicPropertyBinding
  Description: Whether the endpoint should use basic property binding (Camel 2.x) or the newer property binding with additional capabilities.
  Default: false
  Priority: MEDIUM

camel.source.endpoint.matchOnUriPrefix
  Description: Whether the consumer should try to find a target consumer by matching the URI prefix if no exact match is found.
  Default: false
  Priority: MEDIUM

camel.source.endpoint.sparkBinding
  Description: To use a custom SparkBinding to map to/from the Camel message.
  Default: null
  Priority: MEDIUM

camel.source.endpoint.synchronous
  Description: Sets whether synchronous processing should be strictly used, or whether Camel is allowed to use asynchronous processing (if supported).
  Default: false
  Priority: MEDIUM

camel.component.spark-rest.bridgeErrorHandler
  Description: Allows for bridging the consumer to the Camel routing Error Handler, which means any exceptions that occur while the consumer is trying to pick up incoming messages (or the like) are processed as a message and handled by the routing Error Handler. By default the consumer uses the org.apache.camel.spi.ExceptionHandler to deal with exceptions; they are logged at WARN or ERROR level and ignored.
  Default: false
  Priority: MEDIUM

camel.component.spark-rest.ipAddress
  Description: Sets the IP address that Spark should listen on. If not set, the default address is '0.0.0.0'.
  Default: "0.0.0.0"
  Priority: MEDIUM

camel.component.spark-rest.port
  Description: Port number. By default 4567 is used.
  Default: 4567
  Priority: MEDIUM

camel.component.spark-rest.basicPropertyBinding
  Description: Whether the component should use basic property binding (Camel 2.x) or the newer property binding with additional capabilities.
  Default: false
  Priority: MEDIUM

camel.component.spark-rest.maxThreads
  Description: Maximum number of threads in the Spark thread pool (shared globally).
  Default: null
  Priority: MEDIUM

camel.component.spark-rest.minThreads
  Description: Minimum number of threads in the Spark thread pool (shared globally).
  Default: null
  Priority: MEDIUM

camel.component.spark-rest.sparkBinding
  Description: To use a custom SparkBinding to map to/from the Camel message.
  Default: null
  Priority: MEDIUM

camel.component.spark-rest.sparkConfiguration
  Description: To use the shared SparkConfiguration.
  Default: null
  Priority: MEDIUM

camel.component.spark-rest.timeOutMillis
  Description: Thread idle timeout in milliseconds; threads that have been idle for longer than this period are terminated and removed from the thread pool.
  Default: null
  Priority: MEDIUM

camel.component.spark-rest.keystoreFile
  Description: Configures the connection to be secure and to use the given keystore file.
  Default: null
  Priority: MEDIUM

camel.component.spark-rest.keystorePassword
  Description: Configures the connection to be secure and to use the given keystore password.
  Default: null
  Priority: MEDIUM

camel.component.spark-rest.truststoreFile
  Description: Configures the connection to be secure and to use the given truststore file.
  Default: null
  Priority: MEDIUM

camel.component.spark-rest.truststorePassword
  Description: Configures the connection to be secure and to use the given truststore password.
  Default: null
  Priority: MEDIUM
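As a usage sketch for the component-level options above, the following properties bind the embedded Spark server to an explicit address and port and secure it with a keystore. The port value, keystore path, and password are placeholders chosen for illustration, not values from this document.

# Bind the embedded Spark server to an explicit address and port
camel.component.spark-rest.ipAddress=0.0.0.0
camel.component.spark-rest.port=8443
# Secure the connection with a keystore (placeholder path and password)
camel.component.spark-rest.keystoreFile=/path/to/keystore.jks
camel.component.spark-rest.keystorePassword=changeit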