Overview
Meraki does not currently provide alerts when a new client joins the network. The limitation is that you must pick specific client devices to receive notifications for, which by design excludes clients the dashboard has never seen before. This guide will show you how to use results from the Meraki API to create email alerts when a new client joins the network. See our GitHub page for further details.
Prerequisites
Complete the steps from our previous blog post. As a review, you must have:
- Python installed
- Visual Studio Code
- Administrator privileges for your Meraki organization
- API key
- Org ID
- Network ID(s)
- The NetworkIDResponse.json file, containing a list of network IDs.
- JSON files containing a list of network clients for each network in the org, named in the format {network_name}-response.json, where network_name is the name of each network in your org.
- A new file "New-Clients.csv" (create it, or get it from our GitHub).
- In this guide, I will use Microsoft Azure services to send email alerts. Other third-party email servers would also be acceptable but are not covered in this guide.
- You must be an Azure administrator.
Python Dependencies:
Use the pip install command to download the Python packages (note that the pip package names use hyphens):
pip install meraki pytz pandas azure-communication-email azure-identity azure-core
Instructions
You must first obtain your organization's Meraki org ID and network IDs. You can use the Get-OrgID.py script to obtain your org ID, and the Get-NetworkIDs.py script to get all network IDs in your organization. The Get-NetworkClients Python script assumes you have successfully run Get-NetworkIDs.py. See Assumptions below for more information. To use this script, you will also need to configure the following in Azure:
- Communication Resource
- Email Communication Resource
- A verified domain or free Azure domain.
- Obtain your Azure Communication Resource connection string and store it for use in the script.
You may also use another email solution, such as Google's Gmail. Once Get-NetworkIDs.py has run successfully, you may run the Covene-GetNetworkClients-Email-Alert-Template-v2.py script.
Download the new-clients.csv from our GitHub to run alongside the Covene-GetNetworkClients-Email-Alert-Template-v2.py script. The new-clients.csv file is the file that gets emailed when a new client is found.
You will need to edit a few sections of the Covene-GetNetworkClients-Email-Alert-Template-v2.py script for it to run, including:
- API_Key
- org_id
- Timezone Settings
- dt = dt.replace(tzinfo=pytz.UTC)
- central = pytz.timezone('US/Central')
- dt_central = dt.astimezone(central)
- current_date_central = datetime.now(central).date()
- if first_seen_date == current_date_central:
- Email Settings
- connection_string=os.environ["Azure Communication Resource"]
- [email protected]
- "to": [{"address": "ENTER EMAIL YOU WANT TO SEND TO HERE" }],
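The timezone settings listed above convert Meraki's UTC firstSeen timestamp into US Central time before comparing it against today's date. A minimal sketch of that conversion, using a made-up example timestamp:

```python
import pytz
from datetime import datetime

# Meraki reports "firstSeen" in UTC, e.g. "2024-05-01T14:30:00Z" (example value).
timestamp = "2024-05-01T14:30:00Z"

# Parse the string and mark it as UTC.
dt = datetime.strptime(timestamp, "%Y-%m-%dT%H:%M:%SZ").replace(tzinfo=pytz.UTC)

# Convert to the Central Time Zone (CDT is UTC-5 on this date).
central = pytz.timezone('US/Central')
dt_central = dt.astimezone(central)

print(dt_central.strftime("%m-%d-%Y %H:%M:%S"))  # 05-01-2024 09:30:00
```

If your networks sit in a different time zone, swap 'US/Central' for the appropriate pytz zone name and the date comparison still works unchanged.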
Assumptions
- Email Integration setup with Microsoft Azure.
- This requires a paid account with Microsoft.
- Other third-party email servers would also be acceptable but will not be covered in this script.
- See reference guides below for instructions on how to configure the Azure integration.
- Time is formatted to the US Central time zone.
- You want to re-run the get network clients check every 15 minutes.
- This script loops with the time.sleep function. You could remove the loop and use a cron job or Windows Task Scheduler instead, which may be a better long-term option.
- You have environment variables configured that store your Meraki org ID and Meraki API key.
import meraki
import json
import time
import os
import pytz
import base64
import csv
import pandas as pd
from azure.communication.email import EmailClient
from datetime import datetime
# Creates a loop so the script repeats after running through all indented code.
while True:
    # Check the current size of the New-Clients.csv file before running the API.
    file_path = "New-Clients.csv"
    initial_size = os.path.getsize(file_path)
    # Replace the quoted text with your environment variable name.
    # Use API_Key="PasteAPIKeyHere" if you do not want to use an environment variable.
    API_Key = os.environ["MERAKI_DASHBOARD_API_KEY"]
    dashboard = meraki.DashboardAPI(API_Key)
    # Edit the org ID with the company you want to pull the networks for.
    org_id = os.environ["org_id"]
    # Opens the network ID response from the first blog post.
    with open('NetworkIDresponse.json', 'r') as f:
        networks = json.load(f)
    # For each entry in the network ID response JSON file, run an API GET per network.
    for network in networks:
        network_id = network['id']
        Network_Timezone = network['timeZone']
        network_name = network['name']
        print(f'[API]*********Deleting contents of {network_name} Files...*********')
        # Specify the per-network output file paths.
        file_paths = [f'{network_name}-response.json', f'{network_name}-NewClients.csv']
        for file_path in file_paths:
            # Opening in write mode truncates the file if it exists,
            # or creates an empty file if it does not.
            open(file_path, 'w').close()
        print(f'[API]********* {network_name} File contents Erased*********')
        time.sleep(1)
        print(f'[API]*********Running API GET for {network_name}*********')
        response = dashboard.networks.getNetworkClients(
            network_id, timespan=86400, perPage=5000
        )
        print(f'[API]*********{network_name} API Request Completed. Please wait while the .CSV output files are created.....*********')
        with open(f'{network_name}-response.json', 'w') as f:
            json.dump(response, f, indent=4)
        # Open the newly created JSON file.
        with open(f'{network_name}-response.json', 'r') as f:
            data = json.load(f)
        # Open the CSV file in write mode.
        with open(f'{network_name}-NewClients.csv', 'w', newline='') as out:
            # Create a CSV writer and write the header row.
            writer = csv.writer(out)
            writer.writerow(['Network Name', 'MAC', 'Description', 'IP Address', 'Connection Type', 'Connected To', 'SSID', 'Switchport', 'Time'])
            rows = []
            # Iterate over each client dictionary in the list.
            for d in data:
                timestamp = d['firstSeen']
                # Convert the timestamp string to a datetime object and mark it as UTC.
                dt = datetime.strptime(timestamp, "%Y-%m-%dT%H:%M:%SZ")
                dt = dt.replace(tzinfo=pytz.UTC)
                # Convert the datetime object to the Central Time Zone.
                central = pytz.timezone('US/Central')
                dt_central = dt.astimezone(central)
                current_date_central = datetime.now(central).date()
                # Format the datetime object to a more readable format.
                formatted_dt = dt_central.strftime("%m-%d-%Y %H:%M:%S")
                first_seen = d.get('firstSeen')
                ssid = d.get('ssid')
                # If the first-seen date equals today's date, add an entry to the CSV file.
                if first_seen:
                    first_seen_date = dt_central.date()
                    if first_seen_date == current_date_central:
                        rows.append([network_name, d['mac'], d['description'], d['ip'], d['recentDeviceConnection'], d['recentDeviceName'], d['ssid'], d['switchport'], formatted_dt])
            # Sort the data so the newest entries are at the top of the CSV file, then save.
            rows.sort(key=lambda x: datetime.strptime(x[8], "%m-%d-%Y %H:%M:%S"), reverse=True)
            for row in rows:
                writer.writerow(row)
    current_time = datetime.now()
    formatted_time = current_time.strftime("%H:%M:%S")
    # Get all per-network CSV files in the current directory.
    csv_files = [f for f in os.listdir() if f.endswith('.csv') and 'NewClients' in f]
    # Combine every per-network CSV into a single DataFrame.
    df = pd.DataFrame()
    for csv_file in csv_files:
        temp_df = pd.read_csv(csv_file)
        df = pd.concat([df, temp_df], ignore_index=True)
    # Write the combined data to a new CSV file.
    df.to_csv('New-Clients.csv', index=False)
    # Re-open the combined CSV and sort its rows by the 'Time' column, newest first.
    with open('New-Clients.csv', 'r') as f:
        reader = csv.reader(f)
        header = next(reader)  # Get the header row
        rows = list(reader)  # Get the rest of the rows
    # Find the index of the 'Time' column in the header and sort on it.
    time_index = header.index('Time')
    rows.sort(key=lambda x: datetime.strptime(x[time_index], "%m-%d-%Y %H:%M:%S"), reverse=True)
    # Open the CSV file in write mode and save the sorted rows.
    with open('New-Clients.csv', 'w', newline='') as f:
        writer = csv.writer(f)
        writer.writerow(header)  # Write the header row
        writer.writerows(rows)  # Write the sorted rows
    # Print a success message.
    print(f"{formatted_time}:[API]All csv files containing 'NewClients' in the name have been combined into 'New-Clients.csv'.")
    print(f"{formatted_time}:[API]Checking if file needs to be emailed...")
    file_path = "New-Clients.csv"
    current_time = datetime.now()
    current_size = os.path.getsize(file_path)
    current_mtime = os.path.getmtime(file_path)
    if current_size > initial_size:
        # The file size has grown, meaning new clients were found, so send the email.
        with open(file_path, "r") as file:
            file_contents = file.read()
        file_bytes_b64 = base64.b64encode(bytes(file_contents, 'utf-8'))
        print(f'[Email-Section] File change detected at {formatted_time}. Attempting to Send email...')
        def main():
            try:
                connection_string = os.environ["Azure Communication Resource"]
                client = EmailClient.from_connection_string(connection_string)
                message = {
                    "senderAddress": "[email protected]",  # UPDATE THIS to the sender email created in your Azure Email Communication Service.
                    "recipients": {
                        "to": [{"address": "ENTER EMAIL YOU WANT TO SEND TO HERE"}],  # UPDATE THIS to the email address you want to send to.
                    },
                    "content": {
                        "subject": "New Client Connected To The Network",
                        "plainText": f"{file_path} \n Access the Meraki Dashboard here: https://dashboard.meraki.com/ ",
                        "html": """
                        <html>
                        <h1 style='font-size: 0.9em;'>A new client has connected to the network. See the attached file, and review in the <a href='https://dashboard.meraki.com/'>Meraki Dashboard</a></h1>
                        </html>
                        """},
                    "attachments": [
                        {
                            "name": "New-Clients.csv",
                            "contentType": "text/csv",
                            "contentInBase64": file_bytes_b64.decode()
                        }
                    ]
                }
                poller = client.begin_send(message)
                result = poller.result()
            except Exception as ex:
                print(ex)
                # Add emailing of ex here if desired.
            else:
                current_time = datetime.now()
                formatted_time = current_time.strftime("%H:%M:%S")
                print(f'[Email-Section] Email Sent Successfully at {formatted_time}')
        main()
        initial_size = current_size  # Update the initial size
        initial_mtime = current_mtime  # Update the initial modification time
        print(f"{formatted_time}:[Email-Section] Script waiting 15 minutes before checking once more....")
        time.sleep(900)
    else:
        current_time = datetime.now()
        formatted_time = current_time.strftime("%H:%M:%S")
        print(f"{formatted_time}:[Email-Section] {file_path} File Size has not changed. Suppressing duplicate Email alert notification.")
        print(f"{formatted_time}:[Email-Section] Script waiting 15 Minutes before checking again....")
        time.sleep(900)
References
- Azure Communication Email client library for Python
- Overview of Azure Communication Services email
- Email domains and sender authentication for Azure Communication Services
- How to connect a verified email domain
- Meraki Dashboard API Python Library
- Get Network Clients – Meraki Dashboard API v1 – Cisco Meraki Developer Hub
- Meraki GitHub
Improvements To Be Made
This script in its current form will work fine for small to medium-sized organizations; however, it requires many file read/write operations that could be reduced considerably. For an organization with a large number of networks, the read/write operations can become burdensome. Other items that may be useful for this script include:
- Use the pandas library more effectively for CSV manipulation, such as its built-in sorting and filtering.
- Modularization: Break down the script into functions or classes to improve modularity. This makes the code easier to read and maintain.
- Exception Handling: Implement try-except blocks to handle potential exceptions that could occur during API calls or file operations.
- Update Mechanism: Implement a mechanism to check for updates or patches to dependencies, ensuring the script remains compatible with new versions of libraries.
- As stated above, it may be more reliable to use Windows Task Scheduler or cron jobs instead of the built-in while True loop for continued monitoring of new clients.
Contact Us
If you have questions, would like to leave feedback, or want to discuss another topic, please contact us using one of the methods below. We look forward to speaking with you!
Phone: 314-888-2511
Email: [email protected] or [email protected]
Website: https://covene.com/contact-us/