Splunk HTTP Event Collector (HEC) is a fast and efficient way to send data to Splunk Enterprise and Splunk Cloud. It lets your application send data over HTTP (or HTTPS) directly to Splunk, without going through a forwarder.
Checks the health status of the Splunk HTTP Event Collector to ensure it is available and ready to receive events. See the documentation
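For orientation, here is a minimal sketch of that kind of check as a plain Python request against HEC's health endpoint; SPLUNK_HEC_URL and SPLUNK_HEC_TOKEN are hypothetical environment variable names standing in for your own values, not part of this integration:

import os
import requests

# Hypothetical environment variables holding the HEC base URL and token
hec_url = os.environ["SPLUNK_HEC_URL"]      # e.g. https://splunk.example.com:8088
hec_token = os.environ["SPLUNK_HEC_TOKEN"]

# Query the HEC health endpoint; a healthy collector answers with HTTP 200
resp = requests.get(
    f"{hec_url}/services/collector/health",
    headers={"Authorization": f"Splunk {hec_token}"},
    timeout=10,
)
resp.raise_for_status()
print(resp.json())  # e.g. {"text": "HEC is healthy", "code": 17}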
Write Python and use any of the 350k+ PyPI packages available. Refer to the Pipedream Python docs to learn more.
Sends multiple events in a single request to the Splunk HTTP Event Collector. See the documentation
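As a rough sketch of what a batched request looks like at the HTTP level, HEC accepts several JSON event objects concatenated into a single request body. SPLUNK_HEC_URL and SPLUNK_HEC_TOKEN are again hypothetical placeholders:

import json
import os
import requests

# Hypothetical environment variables holding the HEC base URL and token
hec_url = os.environ["SPLUNK_HEC_URL"]
hec_token = os.environ["SPLUNK_HEC_TOKEN"]

events = [
    {"event": "first event", "sourcetype": "manual"},
    {"event": "second event", "sourcetype": "manual"},
]
# Batching: concatenate the individual JSON event objects into one body
payload = "".join(json.dumps(e) for e in events)

resp = requests.post(
    f"{hec_url}/services/collector/event",
    headers={"Authorization": f"Splunk {hec_token}"},
    data=payload,
    timeout=10,
)
resp.raise_for_status()
print(resp.json())  # a successful batch returns {"text": "Success", "code": 0}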
Sends an event to Splunk HTTP Event Collector. See the documentation
import { axios } from "@pipedream/platform"
export default defineComponent({
  props: {
    // The connected Splunk HTTP Event Collector account
    splunk_http_event_collector: {
      type: "app",
      app: "splunk_http_event_collector",
    }
  },
  async run({steps, $}) {
    // A sample event payload
    const data = {
      "event": `Hello world!`,
      "sourcetype": `manual`,
    }
    // POST the event to the HEC collector endpoint, authenticating with the HEC token
    return await axios($, {
      method: "post",
      url: `${this.splunk_http_event_collector.$auth.api_url}:${this.splunk_http_event_collector.$auth.port}/services/collector`,
      headers: {
        "authorization": `Splunk ${this.splunk_http_event_collector.$auth.api_token}`,
      },
      params: {
        channel: `2AC79941-CB26-421C-8826-F57AE23E9702`,
      },
      data,
    })
  },
})
Develop, run and deploy your Python code in Pipedream workflows. Integrate seamlessly with no-code steps and connected accounts, use Data Stores, and manipulate files within a workflow. This includes installing PyPI packages within your code without having to manage a requirements.txt file or run pip.
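For example, here is a minimal sketch of a Pipedream Python step that uses the PyPI requests package simply by importing it; the URL fetched here is only illustrative:

import requests  # importing a PyPI package is enough; Pipedream installs it for the step

def handler(pd: "pipedream"):
    # Call an external API and pass the result to later steps
    resp = requests.get("https://www.splunk.com")
    return {"status_code": resp.status_code}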
Below is an example of using Python to access data from the workflow's trigger and share it with subsequent workflow steps:
def handler(pd: "pipedream"):
    # Reference data from previous steps
    print(pd.steps["trigger"]["context"]["id"])
    # Return data for use in future steps
    return {"foo": {"test": True}}