This notebook provides a guided example of using the Log Analytics and Splunk data uploaders included with MSTICpy.
You must have msticpy installed with the Azure components to run this notebook:
%pip install --upgrade msticpy[azure]
# Setup
from msticpy.init import nbinit
extra_imports = ["msticpy.data.uploaders.splunk_uploader, SplunkUploader",
"msticpy.data.uploaders.loganalytics_uploader, LAUploader"]
nbinit.init_notebook(
namespace=globals(),
extra_imports=extra_imports,
);
WIDGET_DEFAULTS = {
"layout": widgets.Layout(width="95%"),
"style": {"description_width": "initial"},
}
Processing imports.... Checking configuration.... No errors found. No warnings found. Setting options....
# Load some sample data
df = pd.read_csv('https://raw.githubusercontent.com/microsoft/msticpy/master/tests/testdata/az_net_flows.csv', parse_dates=['TimeGenerated'])
df.head(2)
Unnamed: 0 | TenantId | TimeGenerated | FlowStartTime | FlowEndTime | FlowIntervalEndTime | FlowType | ResourceGroup | VMName | VMIPAddress | PublicIPs | SrcIP | DestIP | L4Protocol | L7Protocol | DestPort | FlowDirection | AllowedOutFlows | AllowedInFlows | DeniedInFlows | DeniedOutFlows | RemoteRegion | VMRegion | AllExtIPs | TotalAllowedFlows | |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
0 | 881 | 52b1ab41-869e-4138-9e40-2a4457f09bf0 | 2019-02-12 14:22:40.697 | 2019-02-12 13:00:07.000 | 2019-02-12 13:45:08.000 | 2019-02-12 14:00:00.000 | AzurePublic | asihuntomsworkspacerg | msticalertswin1 | 10.0.3.5 | ['65.55.44.109'] | NaN | NaN | T | https | 443.0 | O | 4.0 | 0.0 | 0.0 | 0.0 | eastus2 | eastus | 65.55.44.109 | 4.0 |
1 | 877 | 52b1ab41-869e-4138-9e40-2a4457f09bf0 | 2019-02-12 14:22:40.681 | 2019-02-12 13:00:48.000 | 2019-02-12 13:58:33.000 | 2019-02-12 14:00:00.000 | AzurePublic | asihuntomsworkspacerg | msticalertswin1 | 10.0.3.5 | ['13.71.172.130', '13.71.172.128'] | NaN | NaN | T | https | 443.0 | O | 18.0 | 0.0 | 0.0 | 0.0 | canadacentral | eastus | 13.71.172.128 | 18.0 |
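Before uploading, it can help to tidy the sample data, for example dropping the residual `Unnamed: 0` index column and checking that the timestamp column parsed as a datetime. A minimal sketch, using a tiny inline stand-in for the `az_net_flows.csv` sample (the columns shown are an illustrative subset):

```python
from io import StringIO

import pandas as pd

# Hypothetical subset of the az_net_flows.csv columns, inlined for the demo.
csv_text = """Unnamed: 0,TimeGenerated,VMName,DestPort
881,2019-02-12 14:22:40.697,msticalertswin1,443.0
877,2019-02-12 14:22:40.681,msticalertswin1,443.0
"""

df = pd.read_csv(StringIO(csv_text), parse_dates=["TimeGenerated"])

# Drop the leftover CSV index column before upload.
df = df.drop(columns=["Unnamed: 0"])

print(df.dtypes["TimeGenerated"])  # datetime64[ns]
```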
Below we collect the details required for our uploader, instantiate our Log Analytics data uploader, and pass it the DataFrame loaded above. We set the debug flag on the uploader so that we get additional details on upload progress.
la_ws_id = widgets.Text(description='Workspace ID:')
la_ws_key = widgets.Password(description='Workspace Key:')
display(la_ws_id)
display(la_ws_key)
Text(value='', description='Workspace ID:')
Password(description='Workspace Key:')
# Instantiate our uploader
la_up = LAUploader(workspace=la_ws_id.value, workspace_secret=la_ws_key.value, debug=True)
# Upload our DataFrame
la_up.upload_df(data=df, table_name='upload_demo')
Upload response code: 200 Upload to upload_demo complete
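For very large DataFrames you may want to split the upload into smaller batches yourself (the Azure Monitor HTTP Data Collector API limits the size of a single post). This is a minimal sketch, not part of the msticpy API: `upload_in_chunks` and the 1,000-row chunk size are hypothetical, and `upload_func` stands in for a call such as `la_up.upload_df`.

```python
import pandas as pd

def upload_in_chunks(data: pd.DataFrame, upload_func, chunk_size: int = 1000):
    """Split `data` into row chunks and hand each chunk to `upload_func`."""
    for start in range(0, len(data), chunk_size):
        upload_func(data.iloc[start : start + chunk_size])

# Demo with a dummy "uploader" that just records the size of each chunk.
sizes = []
demo_df = pd.DataFrame({"value": range(2500)})
upload_in_chunks(demo_df, lambda chunk: sizes.append(len(chunk)), chunk_size=1000)
print(sizes)  # [1000, 1000, 500]
```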
We can now upload a file to our Workspace using the same Uploader. We simply pass the path to the file we want to upload, and we can also pass a table name for the data to be uploaded to.
la_up.upload_file(file_path='data/alertlist.csv', table_name='upload_demo')
Upload response code: 200 Upload to upload_demo complete
We can also upload a whole folder of files to our Workspace using the same uploader. We simply pass the path to the folder we want to upload files from. In this case we aren't going to pass a table name, so a name is generated automatically for each file from the file's name. With a folder we get a progress bar showing the progress of each file's upload.
la_up.upload_folder(folder_path='data/')
HBox(children=(FloatProgress(value=0.0, description='Files', max=10.0, style=ProgressStyle(description_width='…
Upload response code: 200 Upload to alertlist complete Upload response code: 200 Upload to az_net_flows complete Upload response code: 200 Upload to demo_exchange_data complete Upload response code: 200 Upload to host_logons complete Upload response code: 200 Upload to ip_locs complete Upload response code: 200 Upload to processes_on_host complete Upload response code: 200 Upload to raw_network complete Upload response code: 200 Upload to sample_alerts complete Upload response code: 200 Upload to TimeSeriesDemo complete Upload response code: 200 Upload to ti_data complete
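Since the table names above are derived from the file names, it can be useful to preview a folder's contents before uploading. A small sketch with the standard library (the throwaway temp folder stands in for the notebook's `data/` folder; whether `upload_folder` skips non-CSV files is an assumption here):

```python
import tempfile
from pathlib import Path

# Build a throwaway folder standing in for data/.
demo_dir = Path(tempfile.mkdtemp())
for name in ("alertlist.csv", "host_logons.csv", "notes.txt"):
    (demo_dir / name).write_text("col\n1\n")

# The file stems show what the auto-generated table names would look like.
csv_stems = sorted(p.stem for p in demo_dir.glob("*.csv"))
print(csv_stems)  # ['alertlist', 'host_logons']
```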
The Splunk Uploader functions in the same manner as the LogAnalytics one.
Below we collect the details required for our uploader, instantiate our Splunk data uploader, and pass it the DataFrame loaded above.
We are setting the debug flag on our uploader so we can get some additional details on our upload progress.
When uploading a DataFrame, the only difference is that as well as providing a table name (which is represented as sourcetype in Splunk), we also need to pass the Splunk index that we want the data uploaded to. Also, because Splunk uploads data a line at a time, we get a progress bar for the data as it uploads.
sp_host = widgets.Text(description='Splunk host')
sp_user = widgets.Text(description='Username')
sp_pwrd = widgets.Password(description='Password')
display(sp_host)
display(sp_user)
display(sp_pwrd)
Text(value='', description='Splunk host')
Text(value='', description='Username')
Password(description='Password')
# Instantiate our uploader
spup = SplunkUploader(username=sp_user.value, host=sp_host.value, password=sp_pwrd.value, debug=True)
# Upload our DataFrame
spup.upload_df(data=df, table_name='upload_test', index_name='upload_test')
connected
HBox(children=(FloatProgress(value=0.0, description='Rows', max=460.0, style=ProgressStyle(description_width='…
Upload complete
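Splunk indexes events by `_time` (Unix epoch seconds). If you want the uploaded rows searchable at their original event times rather than ingestion time, converting the timestamp column to epoch seconds up front is one option; whether and how `SplunkUploader` maps a column to `_time` depends on your sourcetype configuration, so treat this as a hedged pandas sketch only.

```python
import pandas as pd

df = pd.DataFrame({"TimeGenerated": pd.to_datetime(["2019-02-12 14:22:40"])})

# datetime64[ns] -> int64 gives nanoseconds since the epoch; divide to seconds.
df["epoch_time"] = df["TimeGenerated"].astype("int64") // 10**9

print(df["epoch_time"].iloc[0])  # 1549981360
```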
We can now upload a file to Splunk using the same uploader. We simply pass the path to the file we want to upload along with the index name, and we can also pass a table name for the data to be uploaded to.
spup.upload_file(file_path='data/alertlist.csv', index_name='upload_demo', table_name='upload_demo')
HBox(children=(FloatProgress(value=0.0, description='Rows', max=189.0, style=ProgressStyle(description_width='…
Upload complete
We can also upload a whole folder of files to Splunk. We simply pass the path to the folder we want to upload files from. In this case we aren't going to pass a table name, so the name will be generated automatically for each file from the file's name; however, we still need to pass an index name.
spup.upload_folder(folder_path='data/', index_name='upload_demo')
HBox(children=(FloatProgress(value=0.0, description='Files', max=10.0, style=ProgressStyle(description_width='…
HBox(children=(FloatProgress(value=0.0, description='Rows', max=189.0, style=ProgressStyle(description_width='…
Upload complete data\alertlist.csv uploaded to alertlist
HBox(children=(FloatProgress(value=0.0, description='Rows', max=460.0, style=ProgressStyle(description_width='…
Upload complete data\az_net_flows.csv uploaded to az_net_flows
HBox(children=(FloatProgress(value=0.0, description='Rows', max=1334.0, style=ProgressStyle(description_width=…
Upload complete data\demo_exchange_data.csv uploaded to demo_exchange_data
HBox(children=(FloatProgress(value=0.0, description='Rows', max=14.0, style=ProgressStyle(description_width='i…
Upload complete data\host_logons.csv uploaded to host_logons
HBox(children=(FloatProgress(value=0.0, description='Rows', max=87.0, style=ProgressStyle(description_width='i…
Upload complete data\ip_locs.csv uploaded to ip_locs
HBox(children=(FloatProgress(value=0.0, description='Rows', max=363.0, style=ProgressStyle(description_width='…
Upload complete data\processes_on_host.csv uploaded to processes_on_host
HBox(children=(FloatProgress(value=0.0, description='Rows', max=32.0, style=ProgressStyle(description_width='i…
Upload complete data\raw_network.csv uploaded to raw_network
HBox(children=(FloatProgress(value=0.0, description='Rows', max=78.0, style=ProgressStyle(description_width='i…
Upload complete data\sample_alerts.csv uploaded to sample_alerts
HBox(children=(FloatProgress(value=0.0, description='Rows', max=840.0, style=ProgressStyle(description_width='…
Upload complete data\TimeSeriesDemo.csv uploaded to TimeSeriesDemo
HBox(children=(FloatProgress(value=0.0, description='Rows', max=80.0, style=ProgressStyle(description_width='i…
Upload complete data\ti_data.csv uploaded to ti_data