Commit 9c7a992
Cleanup and bump to the latest version of Kafka. (#10)
1 parent b4123c0 commit 9c7a992

24 files changed: +258 -322 lines changed

README.md

Lines changed: 61 additions & 28 deletions
@@ -1,14 +1,50 @@
 This connector allows Kafka Connect to emulate a [Splunk Http Event Collector](http://dev.splunk.com/view/event-collector/SP-CAAAE6M).
 This connector supports receiving data and writing data to Splunk.
 
-# Source Connector
+# Configuration
 
-The Splunk Source connector allows emulates a [Splunk Http Event Collector](http://dev.splunk.com/view/event-collector/SP-CAAAE6M) to allow
-application that normally log to Splunk to instead write to Kafka. The goal of this plugin is to make the change nearly
-transparent to the user. This plugin currently has support for [X-Forwarded-For](https://en.wikipedia.org/wiki/X-Forwarded-For) so
-it will sit behind a load balancer nicely.
+## SplunkHttpSinkConnector
 
-## Configuration
+The Sink Connector will transform data from a Kafka topic into a batch of JSON messages that will be written via HTTP to a configured [Splunk Http Event Collector](http://dev.splunk.com/view/event-collector/SP-CAAAE6M).
+
+```properties
+name=connector1
+tasks.max=1
+connector.class=com.github.jcustenborder.kafka.connect.splunk.SplunkHttpSinkConnector
+
+# Set these required values
+splunk.remote.host=
+splunk.auth.token=
+```
+
+| Name | Description | Type | Default | Valid Values | Importance |
+|---------------------------------|-----------------------------------------------------------------------------------------------------------------------------------------|----------|----------|--------------|------------|
+| splunk.auth.token | The authorization token to use when writing data to splunk. | password | | | high |
+| splunk.remote.host | The hostname of the remote splunk host to write data to. | string | | | high |
+| splunk.ssl.enabled | Flag to determine if the connection to splunk should be over ssl. | boolean | true | | high |
+| splunk.ssl.trust.store.password | Password for the trust store. | password | [hidden] | | high |
+| splunk.ssl.trust.store.path | Path on the local disk to the certificate trust store. | string | "" | | high |
+| splunk.remote.port | Port on the remote splunk server to write to. | int | 8088 | | medium |
+| splunk.ssl.validate.certs | Flag to determine if ssl connections should validate the certificate of the remote host. | boolean | true | | medium |
+| splunk.connect.timeout.ms | The maximum amount of time for a connection to be established. | int | 20000 | | low |
+| splunk.curl.logging.enabled | Flag to determine if requests to Splunk should be logged in curl form. This will output a curl command to replicate the call to Splunk. | boolean | false | | low |
+| splunk.read.timeout.ms | Sets the timeout in milliseconds to read data from an established connection or 0 for an infinite timeout. | int | 30000 | | low |
+
+## SplunkHttpSourceConnector
+
+The Splunk Source connector emulates a [Splunk Http Event Collector](http://dev.splunk.com/view/event-collector/SP-CAAAE6M) to allow applications that normally log to Splunk to instead write to Kafka. The goal of this plugin is to make the change nearly transparent to the user. This plugin currently has support for [X-Forwarded-For](https://en.wikipedia.org/wiki/X-Forwarded-For), so it will sit nicely behind a load balancer.
+
+```properties
+name=connector1
+tasks.max=1
+connector.class=com.github.jcustenborder.kafka.connect.splunk.SplunkHttpSourceConnector
+
+# Set these required values
+splunk.ssl.key.store.password=
+splunk.collector.index.default=
+splunk.ssl.key.store.path=
+kafka.topic=
+```
 
 | Name | Description | Type | Default | Valid Values | Importance |
 |----------------------------------|-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|----------|---------------------------|--------------|------------|
@@ -24,33 +60,30 @@ it will sit behind a load balancer nicely.
 | splunk.collector.url | Path fragment the servlet should respond on | string | /services/collector/event | | low |
 | splunk.ssl.renegotiation.allowed | Flag to determine if ssl renegotiation is allowed. | boolean | true | | low |
 
-### Example Config
 
-```
-name=splunk-http-source
-tasks.max=1
-connector.class=com.github.jcustenborder.kafka.connect.splunk.SplunkHttpSourceConnector
-splunk.ssl.key.store.path=/etc/security/keystore.jks
-splunk.ssl.key.store.password=password
-splunk.collector.index.default=main
-```
+# Schemas
+
+## com.github.jcustenborder.kafka.connect.splunk.EventKey
+
+This schema represents the key for the data received from the Splunk listener.
+
+| Name | Optional | Schema | Default Value | Documentation |
+|------|----------|-------------------------------------------------------------------------------------------------------|---------------|--------------------------------------------------------------------------------------------------------------------------|
+| host | false | [String](https://kafka.apache.org/0102/javadoc/org/apache/kafka/connect/data/Schema.Type.html#STRING) | | The host value to assign to the event data. This is typically the hostname of the client from which you're sending data. |
 
-# Sink Connector
+## com.github.jcustenborder.kafka.connect.splunk.Event
 
-The Sink Connector will transform data from a Kafka topic into a batch of json messages that will be written via HTTP to
-a configured [Splunk Http Event Collector](http://dev.splunk.com/view/event-collector/SP-CAAAE6M).
+This schema represents the data received from the Splunk listener.
 
-## Configuration
+| Name | Optional | Schema | Default Value | Documentation |
+|------------|----------|-------------------------------------------------------------------------------------------------------|---------------|--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
+| time | true | [Timestamp](https://kafka.apache.org/0102/javadoc/org/apache/kafka/connect/data/Timestamp.html) | | The event time. |
+| host | true | [String](https://kafka.apache.org/0102/javadoc/org/apache/kafka/connect/data/Schema.Type.html#STRING) | | The host value to assign to the event data. This is typically the hostname of the client from which you're sending data. |
+| source | true | [String](https://kafka.apache.org/0102/javadoc/org/apache/kafka/connect/data/Schema.Type.html#STRING) | | The source value to assign to the event data. For example, if you're sending data from an app you're developing, you could set this key to the name of the app. |
+| sourcetype | true | [String](https://kafka.apache.org/0102/javadoc/org/apache/kafka/connect/data/Schema.Type.html#STRING) | | The sourcetype value to assign to the event data. |
+| index | true | [String](https://kafka.apache.org/0102/javadoc/org/apache/kafka/connect/data/Schema.Type.html#STRING) | | The name of the index by which the event data is to be indexed. The index you specify here must be within the list of allowed indexes if the token has the indexes parameter set. |
+| event | true | [String](https://kafka.apache.org/0102/javadoc/org/apache/kafka/connect/data/Schema.Type.html#STRING) | | This is the event itself, in serialized JSON form. It could be an object or a string. |
 
-| Name | Description | Type | Default | Valid Values | Importance |
-|---------------------------------|-----------------------------------------------------------------------------------------|----------|----------|--------------|------------|
-| splunk.auth.token | The authorization token to use when writing data to splunk. | password | | | high |
-| splunk.remote.host | The hostname of the remote splunk host to write data do. | string | | | high |
-| splunk.ssl.enabled | Flag to determine if the connection to splunk should be over ssl. | boolean | true | | high |
-| splunk.ssl.trust.store.password | Password for the trust store. | password | [hidden] | | high |
-| splunk.ssl.trust.store.path | Path on the local disk to the certificate trust store. | string | "" | | high |
-| splunk.remote.port | Port on the remote splunk server to write to. | int | 8088 | | medium |
-| splunk.ssl.validate.certs | Flag to determine if ssl connections should validate the certificateof the remote host. | boolean | true | | medium |
 
 ### Example Config
 

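The README above describes the sink batching Kafka records into JSON messages for the HTTP Event Collector. As a rough illustration only (hypothetical code, not the connector's implementation; the class and method names are invented), each record becomes an HEC "envelope" carrying reserved metadata keys plus an `event` body, and a batch is simply envelopes concatenated:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical illustration, not the connector's code: the shape of the
// Splunk HEC payload the sink batches up. All names here are invented.
public class HecEnvelopeSketch {
  // Minimal JSON string escaping, enough for these examples.
  static String quote(String s) {
    return "\"" + s.replace("\\", "\\\\").replace("\"", "\\\"") + "\"";
  }

  // One HEC envelope: reserved metadata keys, then the serialized event body.
  public static String envelope(Map<String, String> metadata, String eventJson) {
    StringBuilder sb = new StringBuilder("{");
    for (Map.Entry<String, String> e : metadata.entrySet()) {
      sb.append(quote(e.getKey())).append(':').append(quote(e.getValue())).append(',');
    }
    sb.append("\"event\":").append(eventJson).append('}');
    return sb.toString();
  }

  public static void main(String[] args) {
    Map<String, String> meta = new LinkedHashMap<>();
    meta.put("host", "app01");
    meta.put("sourcetype", "access_combined");
    meta.put("index", "main");
    // A batch is envelopes concatenated; HEC accepts them back to back.
    System.out.println(envelope(meta, "{\"msg\":\"hello\"}") + envelope(meta, "{\"msg\":\"world\"}"));
  }
}
```

The metadata keys shown match the `Event` schema fields documented in this README diff.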
pom.xml

Lines changed: 1 addition & 1 deletion
@@ -23,7 +23,7 @@
   <parent>
     <groupId>com.github.jcustenborder.kafka.connect</groupId>
     <artifactId>kafka-connect-parent</artifactId>
-    <version>0.10.1.0-cp1</version>
+    <version>0.10.2.0-cp1</version>
   </parent>
   <artifactId>kafka-connect-splunk</artifactId>
   <version>0.2.0-SNAPSHOT</version>

src/main/java/com/github/jcustenborder/kafka/connect/splunk/EventConverter.java

Lines changed: 6 additions & 4 deletions
@@ -5,7 +5,7 @@
  * you may not use this file except in compliance with the License.
  * You may obtain a copy of the License at
  *
- * http://www.apache.org/licenses/LICENSE-2.0
+ * http://www.apache.org/licenses/LICENSE-2.0
  *
  * Unless required by applicable law or agreed to in writing, software
  * distributed under the License is distributed on an "AS IS" BASIS,
@@ -34,12 +34,14 @@
 class EventConverter {
   public static final Schema KEY_SCHEMA = SchemaBuilder.struct()
       .name("com.github.jcustenborder.kafka.connect.splunk.EventKey")
+      .doc("This schema represents the key for the data received from the Splunk listener.")
       .field("host", SchemaBuilder.string().doc("The host value to assign to the event data. " +
           "This is typically the hostname of the client from which you're sending data.").build())
       .build();
 
   public static final Schema VALUE_SCHEMA = SchemaBuilder.struct()
       .name("com.github.jcustenborder.kafka.connect.splunk.Event")
+      .doc("This schema represents the data received from the Splunk listener.")
       .field("time", Timestamp.builder().optional().doc("The event time.").build())
       .field("host", SchemaBuilder.string().optional().doc("The host value to assign to the event data. " +
           "This is typically the hostname of the client from which you're sending data.").build())
@@ -68,11 +70,11 @@ class EventConverter {
 
   EventConverter(SplunkHttpSourceConnectorConfig config) {
     this.config = config;
-    this.topicPerIndex = this.config.topicPerIndex();
-    this.topicPrefix = this.config.topicPrefix();
+    this.topicPerIndex = this.config.topicPerIndex;
+    this.topicPrefix = this.config.topicPrefix;
     this.indexToTopicLookup = new ConcurrentSkipListMap<>(String.CASE_INSENSITIVE_ORDER);
     this.topic = this.topicPerIndex ? null : this.topicPrefix;
-    this.defaultIndex = this.config.defaultIndex();
+    this.defaultIndex = this.config.defaultIndex;
   }
 
   static <T> void setFieldValue(JsonNode messageNode, Struct struct, String fieldName, Class<T> cls) {

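The constructor above wires up an index-to-topic cache built on `ConcurrentSkipListMap` with `String.CASE_INSENSITIVE_ORDER`. A minimal sketch of that lookup idea (invented class and method names; `topicPrefix` stands in for the connector's topic-prefix config):

```java
import java.util.concurrent.ConcurrentMap;
import java.util.concurrent.ConcurrentSkipListMap;

// Hypothetical sketch of the case-insensitive index-to-topic lookup seen in
// EventConverter's constructor. Not the connector's actual code.
public class IndexTopicLookupSketch {
  final ConcurrentMap<String, String> indexToTopicLookup =
      new ConcurrentSkipListMap<>(String.CASE_INSENSITIVE_ORDER);
  final String topicPrefix;

  public IndexTopicLookupSketch(String topicPrefix) {
    this.topicPrefix = topicPrefix;
  }

  public String topicFor(String index) {
    // Cache hit is case-insensitive: "MAIN" finds the entry stored for "main".
    return indexToTopicLookup.computeIfAbsent(index, i -> topicPrefix + i.toLowerCase());
  }

  public static void main(String[] args) {
    IndexTopicLookupSketch lookup = new IndexTopicLookupSketch("splunk.");
    System.out.println(lookup.topicFor("main"));
    System.out.println(lookup.topicFor("MAIN")); // same topic, no second compute
  }
}
```

The case-insensitive comparator is what lets Splunk index names that differ only in case map to a single Kafka topic.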
src/main/java/com/github/jcustenborder/kafka/connect/splunk/EventServlet.java

Lines changed: 4 additions & 9 deletions
@@ -5,7 +5,7 @@
  * you may not use this file except in compliance with the License.
  * You may obtain a copy of the License at
  *
- * http://www.apache.org/licenses/LICENSE-2.0
+ * http://www.apache.org/licenses/LICENSE-2.0
  *
  * Unless required by applicable law or agreed to in writing, software
  * distributed under the License is distributed on an "AS IS" BASIS,
@@ -48,7 +48,7 @@ public void configure(SplunkHttpSourceConnectorConfig config, JsonFactory jsonFa
     this.jsonFactory = jsonFactory;
     this.converter = new EventConverter(this.config);
     this.recordQueue = recordQueue;
-    this.allowedIndexes = this.config.allowedIndexes();
+    this.allowedIndexes = this.config.allowedIndexes;
   }
 
   @Override
@@ -76,9 +76,7 @@ public String host(HttpServletRequest request) {
 
   @Override
   protected void doPost(HttpServletRequest request, HttpServletResponse response) throws ServletException, IOException {
-    if (log.isInfoEnabled()) {
-      log.info("Reading message body.");
-    }
+    log.trace("Reading message body.");
 
     response.setHeader("X-Content-Type-Options", "nosniff");
     response.setHeader("X-Frame-Options", "SAMEORIGIN");
@@ -116,10 +114,7 @@ protected void doPost(HttpServletRequest request, HttpServletResponse response)
       response.setStatus(200);
 
     } catch (Exception ex) {
-      if (log.isErrorEnabled()) {
-        log.error("Exception thrown", ex);
-      }
-
+      log.error("Exception thrown", ex);
       response.setStatus(500);
     }
   }

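The diff above replaces a guarded `log.info` with a plain `log.trace` call. The tradeoff can be illustrated with JDK logging (the servlet itself uses SLF4J; this sketch only mirrors the guard-versus-direct-call idea with invented names):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.logging.Handler;
import java.util.logging.Level;
import java.util.logging.LogRecord;
import java.util.logging.Logger;

// Illustration only: for a constant message, the logger's own level check
// makes an explicit isXxxEnabled() guard redundant.
public class LogGuardSketch {
  public static final List<String> captured = new ArrayList<>();

  public static void main(String[] args) {
    Logger log = Logger.getLogger("demo");
    log.setUseParentHandlers(false);
    log.setLevel(Level.INFO); // trace-like FINEST messages are filtered out
    log.addHandler(new Handler() {
      @Override public void publish(LogRecord r) { captured.add(r.getMessage()); }
      @Override public void flush() {}
      @Override public void close() {}
    });

    // No guard needed: log() checks the level itself and drops this cheaply.
    log.finest("Reading message body.");
    // A guard only pays off when building the message itself is expensive.
    if (log.isLoggable(Level.INFO)) {
      log.info("Request handled");
    }
  }
}
```

Dropping the guard also lets the commit demote the per-request message from info to trace without extra branching.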
src/main/java/com/github/jcustenborder/kafka/connect/splunk/ObjectMapperFactory.java

Lines changed: 8 additions & 9 deletions
@@ -5,7 +5,7 @@
  * you may not use this file except in compliance with the License.
  * You may obtain a copy of the License at
  *
- * http://www.apache.org/licenses/LICENSE-2.0
+ * http://www.apache.org/licenses/LICENSE-2.0
  *
  * Unless required by applicable law or agreed to in writing, software
  * distributed under the License is distributed on an "AS IS" BASIS,
@@ -43,6 +43,13 @@
 class ObjectMapperFactory {
 
   public static final ObjectMapper INSTANCE;
+  static final Set<String> RESERVED_METADATA = ImmutableSet.of(
+      "time",
+      "host",
+      "source",
+      "sourcetype",
+      "index"
+  );
 
   static {
     ObjectMapper mapper = new ObjectMapper();
@@ -85,14 +92,6 @@ public void serialize(Date date, JsonGenerator jsonGenerator, SerializerProvider
     }
   }
 
-  static final Set<String> RESERVED_METADATA = ImmutableSet.of(
-      "time",
-      "host",
-      "source",
-      "sourcetype",
-      "index"
-  );
-
 static class StructSerializer extends JsonSerializer<Struct> {
 
   @Override

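`RESERVED_METADATA`, moved above the static initializer in this commit, lists the HEC metadata keys. A hypothetical sketch (invented names, not the factory's actual serializer logic) of how such a set can split a flat payload into metadata and event fields:

```java
import java.util.List;
import java.util.Map;
import java.util.Set;
import java.util.TreeMap;

// Hypothetical sketch: partition a flat key/value payload into HEC metadata
// (keys in a reserved set) and remaining event-body fields.
public class ReservedMetadataSketch {
  static final Set<String> RESERVED_METADATA =
      Set.of("time", "host", "source", "sourcetype", "index");

  // Returns [metadata, event-body] as two maps.
  public static List<Map<String, Object>> split(Map<String, Object> flat) {
    Map<String, Object> meta = new TreeMap<>();
    Map<String, Object> event = new TreeMap<>();
    flat.forEach((k, v) -> (RESERVED_METADATA.contains(k) ? meta : event).put(k, v));
    return List.of(meta, event);
  }

  public static void main(String[] args) {
    System.out.println(split(Map.of("host", "web01", "index", "main", "msg", "GET /")));
  }
}
```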
src/main/java/com/github/jcustenborder/kafka/connect/splunk/SplunkHttpSinkConnector.java

Lines changed: 6 additions & 2 deletions
@@ -5,7 +5,7 @@
  * you may not use this file except in compliance with the License.
  * You may obtain a copy of the License at
  *
- * http://www.apache.org/licenses/LICENSE-2.0
+ * http://www.apache.org/licenses/LICENSE-2.0
  *
  * Unless required by applicable law or agreed to in writing, software
  * distributed under the License is distributed on an "AS IS" BASIS,
@@ -15,6 +15,8 @@
  */
 package com.github.jcustenborder.kafka.connect.splunk;
 
+import com.github.jcustenborder.kafka.connect.utils.VersionUtil;
+import com.github.jcustenborder.kafka.connect.utils.config.Description;
 import org.apache.kafka.common.config.ConfigDef;
 import org.apache.kafka.connect.connector.Task;
 import org.apache.kafka.connect.sink.SinkConnector;
@@ -25,14 +27,16 @@
 import java.util.List;
 import java.util.Map;
 
+@Description("The Sink Connector will transform data from a Kafka topic into a batch of json messages that will be written via HTTP to " +
+    "a configured [Splunk Http Event Collector](http://dev.splunk.com/view/event-collector/SP-CAAAE6M).")
 public class SplunkHttpSinkConnector extends SinkConnector {
   private static Logger log = LoggerFactory.getLogger(SplunkHttpSinkConnector.class);
   Map<String, String> settings;
   private SplunkHttpSinkConnectorConfig config;
 
   @Override
   public String version() {
-    return VersionUtil.getVersion();
+    return VersionUtil.version(this.getClass());
   }
 
   @Override

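The `version()` change switches to `VersionUtil.version(this.getClass())`. The helper's implementation is not shown in this commit; one common way such a helper works (an assumption, the library's actual code may differ) is reading `Implementation-Version` from the jar manifest with a fallback:

```java
// Assumption-only sketch of a version helper like VersionUtil.version(Class):
// read Implementation-Version from the jar manifest, falling back otherwise.
public class VersionSketch {
  public static String version(Class<?> cls) {
    Package pkg = cls.getPackage();
    String v = (pkg == null) ? null : pkg.getImplementationVersion();
    // Outside a packaged jar there is no manifest entry, so use a fallback.
    return (v != null) ? v : "0.0.0";
  }

  public static void main(String[] args) {
    System.out.println(version(VersionSketch.class));
  }
}
```

Passing the connector's own class keeps the reported version tied to the jar the class was actually loaded from.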
0 commit comments