Mirror of https://github.com/jokob-sk/NetAlertX.git
synced 2025-12-07 01:26:11 -08:00
Compare commits
70 Commits
508b0c2d83, b354e72489, 90bfa70d1b, 7561a8478d, b5afdb2bce, 5207162d0a, 8eecc54217,
ddfd0d3cb3, bcc5b2f28a, 2e6be21cd9, abb28c4e5b, 44f0ba0924, a6f5e6c499, d992edf6b4,
d7ba540377, 30c95f0d5e, b670e3a8b1, 467e24d167, 7b70d61dd8, d530576e9b, cff6f6393d,
f33d753cc1, 6809688623, 68de633143, a3e21ac17d, c8267f75fa, 9e6a52ca4b, 13c68efb8a,
47a3f7073b, 8e0eb6a480, 911c897b00, b9650d3cf5, 3c959a7920, 5e170da542, 63932fb5bc,
741c0f9ede, 08abbabaad, 65c8f81afd, 80958c2e3f, 233873704d, 90322c4747, 57e6a330be,
0f86b05ce5, 9dd3a0a2d1, 20f847c6d8, 8cd20ab343, de5dfa9d06, 19fe6d53d5, 37fa7fe8a8,
5ec13d89ec, a0a5410af9, b234e1c859, cd761a058f, bf137a9755, 81cfa72b72, c15b5bba5c,
c7913c389f, fc8d17788a, ff72b45f7c, 692cf9305d, 790e98d8a7, 0bd985282f, 1e75eeab4c,
a0d34876cc, c14fa5606d, aab910f68a, b9a7516eb8, 5cf453d4fb, ff40a5acc0, 64d6f8be92
20  .github/ISSUE_TEMPLATE/feature_request.md (vendored)
@@ -1,20 +0,0 @@
---
name: Feature request
about: Suggest an idea for this project
title: ''
labels: ''
assignees: ''

---

**Is your feature request related to a problem? Please describe.**
A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]

**Describe the solution you'd like**
A clear and concise description of what you want to happen.

**Describe alternatives you've considered**
A clear and concise description of any alternative solutions or features you've considered.

**Additional context**
Add any other context or screenshots about the feature request here.

38  .github/ISSUE_TEMPLATE/feature_request.yml (vendored, new executable file)
@@ -0,0 +1,38 @@
name: Feature Request
description: 'Suggest an idea for PiAlert'
labels: ['Feature request➕']
body:
- type: checkboxes
attributes:
label: Is there an existing issue for this?
description: Please search to see if an open or closed issue already exists for the feature you are requesting.
options:
- label: I have searched the existing open and closed issues
required: true
- type: textarea
attributes:
label: Is your feature request related to a problem? Please describe
description: A clear and concise description of what the problem is.
validations:
required: true
- type: textarea
attributes:
label: Describe the solution you'd like
description: A clear and concise description of what you want to happen.
validations:
required: true
- type: textarea
attributes:
label: Describe alternatives you've considered
description: A clear and concise description of any alternative solutions or features you've considered.
validations:
required: true
- type: textarea
attributes:
label: Anything else?
description: |
Links? References? Mockups? Anything that will give us more context about the feature you are encountering!

Tip: You can attach images or log files by clicking this area to highlight it and then dragging files in.
validations:
required: true

46  .github/ISSUE_TEMPLATE/i-have-an-issue.md (vendored)
@@ -1,46 +0,0 @@
---
name: I have an issue
about: Describe this issue template's purpose here.
title: ''
labels: ''
assignees: ''

---

## Describe the issue

> When submitting an issue ❗[enable debug](https://github.com/jokob-sk/Pi.Alert/blob/main/docs/DEBUG_TIPS.md)❗ and [have a look at the docs](https://github.com/jokob-sk/Pi.Alert/tree/main/docs)

[describe your issue]

## Paste your `pialert.conf` (remove personal info)

```
paste_here
```

## Paste your `docker-compose.yml` and `.env` (remove personal info)

`docker-compose.yml`

```
paste_here
```

`.env`

```
paste_here
```

## Screenshots

[If applicable, add screenshots to help explain your problem.]

## Paste last few lines from `pialert.log`

> You can use `tail -100 /home/pi/pialert/front/log/pialert.log`

```bash

# paste code below

76  .github/ISSUE_TEMPLATE/i-have-an-issue.yml (vendored, new executable file)
@@ -0,0 +1,76 @@
name: Bug Report
description: 'When submitting an issue enable debug and have a look at the docs.'
labels: ['bug 🐛']
body:
- type: checkboxes
attributes:
label: Is there an existing issue for this?
description: Please search to see if an open or closed issue already exists for the bug you encountered.
options:
- label: I have searched the existing open and closed issues and I checked the docs https://github.com/jokob-sk/Pi.Alert/tree/main/docs
required: true
- type: textarea
attributes:
label: Current Behavior
description: A concise description of what you're experiencing.
validations:
required: true
- type: textarea
attributes:
label: Expected Behavior
description: A concise description of what you expected to happen.
validations:
required: true
- type: textarea
attributes:
label: Steps To Reproduce
description: Steps to reproduce the behavior.
placeholder: |
1. With these settings...
2. With this config...
3. Run '...'
4. See error...
validations:
required: false
- type: textarea
attributes:
label: pialert.conf
description: |
Paste your `pialert.conf` (remove personal info)
render: python
validations:
required: false
- type: textarea
attributes:
label: docker-compose.yml
description: |
Paste your `docker-compose.yml`
render: python
validations:
required: false
- type: dropdown
attributes:
label: What branch are you running?
options:
- Production
- Dev
validations:
required: true
- type: textarea
attributes:
label: pialert.log
description: |
Logs with debug enabled (https://github.com/jokob-sk/Pi.Alert/blob/main/docs/DEBUG_TIPS.md) ⚠
***Generally speaking, all bug reports should have logs provided.***
Tip: You can attach images or log files by clicking this area to highlight it and then dragging files in.
Additionally, any additional info? Screenshots? References? Anything that will give us more context about the issue you are encountering!
You can use `tail -100 /home/pi/pialert/front/log/pialert.log` in teh container if you have troubles getting to the log files.
validations:
required: false
- type: checkboxes
attributes:
label: Debug enabled
description: I confirm I enabled `debug`
options:
- label: I have read and followed the steps in the wiki link above and provided the required debug logs and the log section covers the time when the issue occurs.
required: true

11  .github/workflows/docker_dev.yml (vendored)
@@ -74,13 +74,18 @@ jobs:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}

# # Disable this after use
# - name: Prune Docker Builder
# run: docker builder prune --force

- name: Build and push
uses: docker/build-push-action@v3
with:
context: .
platforms: linux/amd64,linux/arm64,linux/arm/v7
platforms: linux/amd64,linux/arm64,linux/arm/v7,linux/arm/v6
push: ${{ github.event_name != 'pull_request' }}
tags: ${{ steps.meta.outputs.tags }}
labels: ${{ steps.meta.outputs.labels }}
cache-from: type=registry,ref=ghcr.io/jokob-sk/pi.alert:buildcache
cache-to: type=registry,ref=ghcr.io/jokob-sk/pi.alert:buildcache,mode=max
# # ⚠ disable cache if build is failing to download debian packages
# cache-from: type=registry,ref=ghcr.io/jokob-sk/pi.alert:buildcache
# cache-to: type=registry,ref=ghcr.io/jokob-sk/pi.alert:buildcache,mode=max

7  .github/workflows/docker_prod.yml (vendored)
@@ -76,9 +76,10 @@ jobs:
uses: docker/build-push-action@v3
with:
context: .
platforms: linux/amd64,linux/arm64,linux/arm/v7
platforms: linux/amd64,linux/arm64,linux/arm/v7,linux/arm/v6
push: ${{ github.event_name != 'pull_request' }}
tags: ${{ steps.meta.outputs.tags }}
labels: ${{ steps.meta.outputs.labels }}
cache-from: type=registry,ref=ghcr.io/jokob-sk/pi.alert:buildcache
cache-to: type=registry,ref=ghcr.io/jokob-sk/pi.alert:buildcache,mode=max
# # ⚠ disable cache if build is failing to download debian packages
# cache-from: type=registry,ref=ghcr.io/jokob-sk/pi.alert:buildcache
# cache-to: type=registry,ref=ghcr.io/jokob-sk/pi.alert:buildcache,mode=max

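If you want to sanity-check the expanded platform list locally before relying on the workflow, a rough sketch with Docker buildx (the builder name and tag are placeholders, and QEMU/binfmt emulation is assumed to be installed):

```bash
# Create a buildx builder and try all four target platforms from the workflow.
docker buildx create --use --name pialert-multiarch
docker buildx build \
  --platform linux/amd64,linux/arm64,linux/arm/v7,linux/arm/v6 \
  -t pialert:multiarch-test .
```

Without `--push` or `--load` the result only lands in the build cache, which is enough to confirm that every target compiles.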
@@ -6,7 +6,7 @@ The issue tracker is the preferred channel for bug reports, features requests an

Before submitting a new issue please spend a couple of minutes on research:

* Check [🛑 Common issues](https://github.com/jokob-sk/Pi.Alert/tree/main/dockerfiles#-common-issues)
* Check [🛑 Common issues](https://github.com/jokob-sk/Pi.Alert/blob/main/docs/DEBUG_TIPS.md#common-issues)
* Check [💡 Closed issues](https://github.com/jokob-sk/Pi.Alert/issues?q=is%3Aissue+is%3Aclosed) if a similar issue was solved in the past.

## Pull-requests (PRs)

15  README.md
@@ -1,6 +1,6 @@
# 💻🔍 Network security scanner
# 💻🔍 Network security scanner & notification framework

Scans for devices, port changes on your WIFI/LAN and alerts you if unknown devices or changes are found.
Get visibility of what's going on on your WIFI/LAN network. Scan for devices, port changes and get alerts if unknown devices or changes are found. Write your own [Plugins](https://github.com/jokob-sk/Pi.Alert/tree/main/front/plugins#readme) with auto-generated UI and in-build notification system.

[](https://github.com/jokob-sk/Pi.Alert/actions/workflows/docker_prod.yml)
[](https://github.com/jokob-sk/Pi.Alert)
@@ -13,15 +13,15 @@ Scans for devices, port changes on your WIFI/LAN and alerts you if unknown devic

## Why PiAlert❓

Most of us don't know what's going on on our home network, but we want our family and data to _be safe_. _Command-line tools_ are great, but the output can be _hard to understand_ and action if you are not a network specialist 😖.
Most of us don't know what's going on on our home network, but we want our family and data to be safe. _Command-line tools_ are great, but the output can be _hard to understand_ and action if you are not a network specialist.

PiAlert gives you peace of mind. _Visualize and immediately report 📬_ what is going on in your network - this is the first step to enhance your _network security 🔐_.

_PiAlert combines several network and other scanning tools 🔍 with notifications 📧 into one user-friendly package 📦_. You get an overview of network device Sessions, Connected devices, Favorites, Events, Presence, Down alerts, and IPs. You can schedule Nmap scans to detect changes in device ports and visualize your Network topology (even with undetectable, dummy devices).
_PiAlert combines several network and other scanning tools 🔍 with notifications 📧 into one user-friendly package 📦_. You get an overview of network device Sessions, Connected devices, Events, Presence, Down alerts, and IPs. You can schedule Nmap scans to detect changes in device ports and visualize your Network topology (even with undetectable, dummy devices).

Setup a _kill switch ☠_ for your network via a smart plug with the available [Home Assistant](https://github.com/jokob-sk/Pi.Alert/blob/main/docs/HOME_ASSISTANT.md) integration. Implement custom automations with the [CSV device Exports 📤](https://github.com/jokob-sk/Pi.Alert/tree/main/front/plugins/csv_backup), [Webhooks](https://github.com/jokob-sk/Pi.Alert/blob/main/docs/WEBHOOK_N8N.md), or [API endpoints](https://github.com/jokob-sk/Pi.Alert/blob/main/docs/API.md) features.

Extend the app if you want to create your own scanner and handle the results and notifications in PiAlert. Check available [Plugins & Instructions](https://github.com/jokob-sk/Pi.Alert/tree/main/front/plugins).
Extend the app if you want to create your own scanner [Plugin](https://github.com/jokob-sk/Pi.Alert/tree/main/front/plugins#readme) and handle the results and notifications in PiAlert.

Looking forward to your contributions if you decide to share your work with the community ❤.

@@ -34,11 +34,10 @@ Looking forward to your contributions if you decide to share your work with the

| Features | Details |
|-------------|-------------|
| 🔍 | The app scans your network for, **New devices**, **New connections** (re-connections), **Disconnections**, **"Always Connected" devices down**, Devices **IP changes** and **Internet IP address changes**. Discovery & scan methods include: **arp-scan**. **Pi-hole - DB import**, **Pi-hole - DHCP leases import**, **Generic DHCP leases import**. **UNIFI controller import**, **SNMP-enabled router import**. Check the [Plugins](https://github.com/jokob-sk/Pi.Alert/tree/main/front/plugins) docs for more info on individual scans. |
| 🔍 | The app scans your network for, **New devices**, **New connections** (re-connections), **Disconnections**, **"Always Connected" devices down**, Devices **IP changes** and **Internet IP address changes**. Discovery & scan methods include: **arp-scan**. **Pi-hole - DB import**, **Pi-hole - DHCP leases import**, **Generic DHCP leases import**. **UNIFI controller import**, **SNMP-enabled router import**. Check the [Plugins](https://github.com/jokob-sk/Pi.Alert/tree/main/front/plugins#readme) docs for more info on individual scans. |
|📧 | Send notifications to more than 80+ services, including Telegram via [Apprise](https://hub.docker.com/r/caronc/apprise), or use [Pushsafer](https://www.pushsafer.com/), or [NTFY](https://ntfy.sh/). |
|🧩 | Feed your data and device changes into [Home Assistant](https://github.com/jokob-sk/Pi.Alert/blob/main/docs/HOME_ASSISTANT.md), read [API endpoints](https://github.com/jokob-sk/Pi.Alert/blob/main/docs/API.md), or use [Webhooks](https://github.com/jokob-sk/Pi.Alert/blob/main/docs/WEBHOOK_N8N.md) to setup custom automation flows. |
|➕ | Build your own scanners with the [Plugin system](https://github.com/jokob-sk/Pi.Alert/tree/main/front/plugins) |

|➕ | Build your own scanners with the [Plugin system](https://github.com/jokob-sk/Pi.Alert/tree/main/front/plugins#readme) |


## Installation & Documentation

@@ -21,7 +21,6 @@ SCAN_SUBNETS=['192.168.1.0/24 --interface=eth1']
TIMEZONE='Europe/Berlin'
PIALERT_WEB_PROTECTION=False
PIALERT_WEB_PASSWORD='8d969eef6ecad3c29a3a629280e686cf0c3f5d5a86aff3ca12020c923adc6c92'
INCLUDED_SECTIONS=['internet','new_devices','down_devices','events']
DAYS_TO_KEEP_EVENTS=90
# Used for generating links in emails. Make sure not to add a trailing slash!
REPORT_DASHBOARD_URL='http://pi.alert'

@@ -1,4 +1,5 @@
#!/bin/sh
#!/usr/bin/env bash

# ------------------------------------------------------------------------------
# Pi.Alert
# Open Source Network Guard / WIFI & LAN intrusion detector
@@ -20,18 +21,15 @@ echo "---------------------------------------------------------"

# ----------------------------------------------------------------------
echo Updating... /usr/share/ieee-data/
cd /usr/share/ieee-data/
cd /usr/share/ieee-data/ || { echo "could not enter /usr/share/ieee-data directory"; exit 1; }

sudo mkdir -p 2_backup
sudo cp *.txt 2_backup
sudo cp *.csv 2_backup
sudo cp -- *.txt 2_backup
sudo cp -- *.csv 2_backup
echo ""
echo Download Start
echo ""
sudo curl $1 -LO https://standards-oui.ieee.org/iab/iab.csv \
-LO https://standards-oui.ieee.org/iab/iab.txt \
-LO https://standards-oui.ieee.org/oui28/mam.csv \
-LO https://standards-oui.ieee.org/iab/iab.txt \
sudo curl "$1" -LO https://standards-oui.ieee.org/oui28/mam.csv \
-LO https://standards-oui.ieee.org/oui28/mam.csv \
-LO https://standards-oui.ieee.org/oui28/mam.txt \
-LO https://standards-oui.ieee.org/oui36/oui36.csv \
@@ -44,10 +42,10 @@ echo Download Finished
# ----------------------------------------------------------------------
echo ""
echo Updating... /usr/share/arp-scan/
cd /usr/share/arp-scan
cd /usr/share/arp-scan || { echo "could not enter /usr/share/arp-scan directory"; exit 1; }

sudo mkdir -p 2_backup
sudo cp *.txt 2_backup
sudo cp -- *.txt 2_backup

# Update from /usb/lib/ieee-data
sudo get-iab -v

@@ -17,7 +17,17 @@
"title": "Pi.Alert Notifications",
"title_link": "",
"text": {
"internet": [],
"new_devices_meta": {
"title": "New devices",
"columnNames": [
"MAC",
"Datetime",
"IP",
"Event Type",
"Device name",
"Comments"
]
},
"new_devices": [
{
"MAC": "74:ac:74:ac:74:ac",
@@ -29,7 +39,29 @@
"Device Vendor": null
}
],
"down_devices_meta": {
"title": "Down devices",
"columnNames": [
"MAC",
"Datetime",
"IP",
"Event Type",
"Device name",
"Comments"
]
},
"down_devices": [],
"events_meta": {
"title": "Events",
"columnNames": [
"MAC",
"Datetime",
"IP",
"Event Type",
"Device name",
"Comments"
]
},
"events": [
{
"MAC": "74:ac:74:ac:74:ac",
@@ -50,6 +82,20 @@
"Device Vendor": null
}
],
"plugins_meta": {
"title": "Plugins",
"columnNames": [
"Plugin",
"Object_PrimaryID",
"Object_SecondaryID",
"DateTimeChanged",
"Watched_Value1",
"Watched_Value2",
"Watched_Value3",
"Watched_Value4",
"Status"
]
},
"plugins": [
{
"Index": 138,

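A captured notification payload like the one above can also be inspected from the shell; a minimal sketch with `jq`, assuming the JSON has been saved to a file named `webhook.json` (the filename is only an assumption):

```bash
# Walk the whole document and print the new_devices array, wherever it is nested.
jq '.. | objects | select(has("new_devices")) | .new_devices' webhook.json
```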
@@ -29,6 +29,7 @@ services:
- ${APP_DATA_LOCATION}/pialert/php.ini:/etc/php/8.2/fpm/php.ini
- ${DEV_LOCATION}/install:/home/pi/pialert/install
- ${DEV_LOCATION}/front/css:/home/pi/pialert/front/css
- ${DEV_LOCATION}/back/update_vendors.sh:/home/pi/pialert/back/update_vendors.sh
- ${DEV_LOCATION}/front/lib/AdminLTE:/home/pi/pialert/front/lib/AdminLTE
- ${DEV_LOCATION}/front/js:/home/pi/pialert/front/js
- ${DEV_LOCATION}/dockerfiles/start.sh:/home/pi/pialert/dockerfiles/start.sh

@@ -4,7 +4,7 @@
[](https://hub.docker.com/r/jokobsk/pi.alert)
[](https://hub.docker.com/r/jokobsk/pi.alert)

# PiAlert 💻🔍 Network security scanner
# PiAlert 💻🔍 Network security scanner & notification framework

| 🐳 [Docker hub](https://registry.hub.docker.com/r/jokobsk/pi.alert) | 📑 [Docker guide](https://github.com/jokob-sk/Pi.Alert/blob/main/dockerfiles/README.md) |🆕 [Release notes](https://github.com/jokob-sk/Pi.Alert/releases) | 📚 [All Docs](https://github.com/jokob-sk/Pi.Alert/tree/main/docs) |
|----------------------|----------------------| ----------------------| ----------------------|
@@ -18,7 +18,7 @@

## 📕 Basic Usage

- You will have to run the container on the host network, e.g:
- You will have to run the container on the `host` network, e.g:

```yaml
docker run -d --rm --network=host \
@@ -27,8 +27,8 @@ docker run -d --rm --network=host \
-e TZ=Europe/Berlin \
-e PORT=20211 \
jokobsk/pi.alert:latest
```
- The initial scan can take up-to 15min (with 50 devices and MQTT). Subsequent ones 3 and 5 minutes so wait that long for all of the scans to run.
```
- The initial scan can take up to 15min (with 50 devices and MQTT). Subsequent ones 3 and 5 minutes so wait that long for all of the scans to run.

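To confirm those first scans actually ran, a quick sketch for watching the log output (the container name `pialert` and the mapped log folder are assumptions, adjust to your setup):

```bash
# Follow the container output while the initial scans complete.
docker logs -f pialert
# Or, if the log folder is mapped to ./log on the host:
tail -f ./log/pialert.log
```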
### Docker environment variables

@@ -42,22 +42,22 @@ docker run -d --rm --network=host \

### Docker paths

| | Path | Description |
| Required | Path | Description |
| :------------- | :------------- |:-------------|
| **Required** | `:/home/pi/pialert/config` | Folder which will contain the `pialert.conf` file (see below for details) |
| **Required** | `:/home/pi/pialert/db` | Folder which will contain the `pialert.db` file |
|Optional| `:/home/pi/pialert/front/log` | Logs folder useful for debugging if you have issues setting up the container |
|Optional| `:/etc/pihole/pihole-FTL.db` | PiHole's `pihole-FTL.db` database file. Required if you want to use PiHole |
|Optional| `:/etc/pihole/dhcp.leases` | PiHole's `dhcp.leases` file. Required if you want to use PiHole `dhcp.leases` file. This has to be matched with a corresponding `DHCPLSS_paths_to_check` setting entry. (the path in the container must contain `pihole`)|
|Optional| `:/home/pi/pialert/front/api` | A simple [API endpoint](https://github.com/jokob-sk/Pi.Alert/blob/main/docs/API.md) containing static (but regularly updated) json and other files. |
|Optional| `:/home/pi/pialert/front/plugins/<plugin>/ignore_plugin` | Map a file `ignore_plugin` to ignore a plugin. Plugins can be soft-disabled via settings. More in the [Plugin docs](/front/plugins/README.md). |
| ✅ | `:/home/pi/pialert/config` | Folder which will contain the `pialert.conf` file (see below for details) |
| ✅ | `:/home/pi/pialert/db` | Folder which will contain the `pialert.db` file |
| | `:/home/pi/pialert/front/log` | Logs folder useful for debugging if you have issues setting up the container |
| | `:/etc/pihole/pihole-FTL.db` | PiHole's `pihole-FTL.db` database file. Required if you want to use PiHole |
| | `:/etc/pihole/dhcp.leases` | PiHole's `dhcp.leases` file. Required if you want to use PiHole `dhcp.leases` file. This has to be matched with a corresponding `DHCPLSS_paths_to_check` setting entry. (the path in the container must contain `pihole`)|
| | `:/home/pi/pialert/front/api` | A simple [API endpoint](https://github.com/jokob-sk/Pi.Alert/blob/main/docs/API.md) containing static (but regularly updated) json and other files. |
| | `:/home/pi/pialert/front/plugins/<plugin>/ignore_plugin` | Map a file `ignore_plugin` to ignore a plugin. Plugins can be soft-disabled via settings. More in the [Plugin docs](/front/plugins/README.md). |

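Putting the two required mounts together with the usage example above, a minimal `docker run` sketch (the host-side paths are placeholders):

```bash
# Container-side paths come from the table above; ~/pialert/... is just an example host location.
docker run -d --rm --network=host \
  -v ~/pialert/config:/home/pi/pialert/config \
  -v ~/pialert/db:/home/pi/pialert/db \
  -e TZ=Europe/Berlin \
  -e PORT=20211 \
  jokobsk/pi.alert:latest
```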
### Config (`pialert.conf`)
### Modify the config (`pialert.conf`) only if UI is not available

- If unavailable, the app generates a default `pialert.conf` and `pialert.db` file on the first run.
- The preferred way is to manage the configuration via the Settings section in the UI.
- You can modify [pialert.conf](https://github.com/jokob-sk/Pi.Alert/tree/main/config) directly, if needed.
- If unavailable, the app generates a default `pialert.conf` and `pialert.db` file on the first run.

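If you do have to fall back to editing the file directly, a small sketch, assuming the config folder is mapped to `./config` on the host and the container is named `pialert`:

```bash
# Edit the mapped config on the host with any editor.
nano ./config/pialert.conf
# Restart the container if the change is not picked up automatically.
docker restart pialert
```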
#### Important settings

@@ -82,6 +82,20 @@ There are 2 approaches how to get PiHole devices imported. Via the PiHole import

> [!NOTE]
> It's recommended to use the same schedule interval for all plugins responsible for discovering new devices.


#### 🧭 Community guides

> Use the official installation guides at first and use community content as suplementary material. Open an issue if you'd like to add your link to the list 🙏

- 📄 [How to Install Pi.Alert on Your Synology NAS - Marius hosting (English)](https://mariushosting.com/how-to-install-pi-alert-on-your-synology-nas/) (Updated frequently)
- 📄 [시놀/헤놀에서 네트워크 스캐너 Pi.Alert Docker로 설치 및 사용하기 (Korean)](https://blog.dalso.org/article/%EC%8B%9C%EB%86%80-%ED%97%A4%EB%86%80%EC%97%90%EC%84%9C-%EB%84%A4%ED%8A%B8%EC%9B%8C%ED%81%AC-%EC%8A%A4%EC%BA%90%EB%84%88-pi-alert-docker%EB%A1%9C-%EC%84%A4%EC%B9%98-%EB%B0%8F-%EC%82%AC%EC%9A%A9) (July 2023)
- 📄 [网络入侵探测器Pi.Alert (Chinese)](https://codeantenna.com/a/VgUvIAjZ7J) (May 2023)
- ▶ [Pi.Alert auf Synology & Docker by - Jürgen Barth (German)](https://www.youtube.com/watch?v=-ouvA2UNu-A) (March 2023)
- ▶ [Top Docker Container for Home Server Security - VirtualizationHowto (English)](https://www.youtube.com/watch?v=tY-w-enLF6Q) (March 2023)
- ▶ [Pi.Alert or WatchYourLAN can alert you to unknown devices appearing on your WiFi or LAN network - Danie van der Merwe (English)](https://www.youtube.com/watch?v=v6an9QG2xF0) (November 2022)

> Ordered by last update time.

### **Common issues**

@@ -89,7 +103,10 @@ There are 2 approaches how to get PiHole devices imported. Via the PiHole import

⚠ Check also common issues and [debugging tips](https://github.com/jokob-sk/Pi.Alert/blob/main/docs/DEBUG_TIPS.md).

## 📄 Examples
> [!NOTE]
> You can bulk-update devices via the [CSV import method](https://github.com/jokob-sk/Pi.Alert/blob/main/docs/DEVICES_BULK_EDITING.md).

## 📄 docker-compose.yml Examples

### Example 1

@@ -213,11 +230,7 @@ Courtesy of [pbek](https://github.com/pbek). The volume `pialert_db` is used by

## 🏅 Recognitions

Big thanks to <a href="https://github.com/Macleykun">@Macleykun</a> for help and tips&tricks for Dockerfile(s):

<a href="https://github.com/Macleykun">
<img src="https://avatars.githubusercontent.com/u/26381427?size=50">
</a>
Big thanks to <a href="https://github.com/Macleykun">@Macleykun</a> for help and tips&tricks for Dockerfile(s).

## ❤ Support me

@@ -1,4 +1,4 @@
#!/bin/bash
#!/usr/bin/env bash

echo "---------------------------------------------------------"
echo "[INSTALL] Run start.sh"
@@ -7,6 +7,13 @@ echo "---------------------------------------------------------"

INSTALL_DIR=/home/pi # Specify the installation directory here

# DO NOT CHANGE ANYTHING BELOW THIS LINE!
WEB_UI_DIR=/var/www/html/pialert
NGINX_CONFIG_FILE=/etc/nginx/conf.d/pialert.conf
OUI_FILE="/usr/share/arp-scan/ieee-oui.txt" # Define the path to ieee-oui.txt and ieee-iab.txt
FILEDB=$INSTALL_DIR/pialert/db/pialert.db
# DO NOT CHANGE ANYTHING ABOVE THIS LINE!

# if custom variables not set we do not need to do anything
if [ -n "${TZ}" ]; then
FILECONF=$INSTALL_DIR/pialert/config/pialert.conf
@@ -29,38 +36,50 @@ echo "[INSTALL] Run setup scripts"
"$INSTALL_DIR/pialert/dockerfiles/user-mapping.sh"
"$INSTALL_DIR/pialert/install/install_dependencies.sh" # if modifying this file transfer the chanegs into the root Dockerfile as well!

# Change port number if set
if [ -n "${PORT}" ]; then
sed -ie 's/listen 20211/listen '${PORT}'/g' /etc/nginx/sites-available/default
fi

echo "[INSTALL] Setup NGINX"

# Remove /html folder if exists
sudo rm -R /var/www/html
# Remove default NGINX site if it is symlinked, or backup it otherwise
if [ -L /etc/nginx/sites-enabled/default ] ; then
echo "Disabling default NGINX site, removing sym-link in /etc/nginx/sites-enabled"
sudo rm /etc/nginx/sites-enabled/default
elif [ -f /etc/nginx/sites-enabled/default ]; then
echo "Disabling default NGINX site, moving config to /etc/nginx/sites-available"
sudo mv /etc/nginx/sites-enabled/default /etc/nginx/sites-available/default.bkp_pialert
fi

# Clear existing directories and files
if [ -d $WEB_UI_DIR ]; then
echo "Removing existing PiAlert web-UI"
sudo rm -R $WEB_UI_DIR
fi

if [ -f $NGINX_CONFIG_FILE ]; then
echo "Removing existing PiAlert NGINX config"
sudo rm $NGINX_CONFIG_FILE
fi

# create symbolic link to the pialert install directory
ln -s $INSTALL_DIR/pialert/front /var/www/html
# remove dfault NGINX site
sudo rm /etc/nginx/sites-available/default
ln -s $INSTALL_DIR/pialert/front $WEB_UI_DIR
# create symbolic link to NGINX configuaration coming with PiAlert
sudo ln -s "$INSTALL_DIR/pialert/install/default" /etc/nginx/sites-available/default
# use user-supplied port
sudo sed -i 's/listen 80/listen '"$PORT"'/g' /etc/nginx/sites-available/default
sudo ln -s "$INSTALL_DIR/pialert/install/pialert.conf" /etc/nginx/conf.d/pialert.conf

# Use user-supplied port if set
if [ -n "${PORT}" ]; then
echo "Setting webserver to user-supplied port ($PORT)"
sudo sed -i 's/listen 20211/listen '"$PORT"'/g' /etc/nginx/conf.d/pialert.conf
fi

# Change web interface address if set
if [ -n "${LISTEN_ADDR}" ]; then
sed -ie 's/listen /listen '${LISTEN_ADDR}:'/g' /etc/nginx/sites-available/default
if [ -n "${LISTEN_ADDR}" ]; then
echo "Setting webserver to user-supplied address ($LISTEN_ADDR)"
sed -ie 's/listen /listen '"${LISTEN_ADDR}":'/g' /etc/nginx/conf.d/pialert.conf
fi

# Run the hardware vendors update at least once
echo "[INSTALL] Run the hardware vendors update"

# Define the path to ieee-oui.txt and ieee-iab.txt
oui_file="/usr/share/arp-scan/ieee-oui.txt"

# Check if ieee-oui.txt or ieee-iab.txt exist
if [ -f "$oui_file" ]; then
if [ -f "$OUI_FILE" ]; then
echo "The file ieee-oui.txt exists. Skipping update_vendors.sh..."
else
echo "The file ieee-oui.txt does not exist. Running update_vendors..."
@@ -76,23 +95,26 @@ fi
# Fixing file permissions
echo "[INSTALL] Fixing file permissions"

echo "[INSTALL] Fixing WEB_UI_DIR: $WEB_UI_DIR"

chmod -R a+rwx $WEB_UI_DIR

echo "[INSTALL] Fixing INSTALL_DIR: $INSTALL_DIR"

chmod -R a+rwx /var/www/html
chmod -R a+rw $INSTALL_DIR/pialert/front/log
chmod -R a+rwx $INSTALL_DIR

FILEDB=$INSTALL_DIR/pialert/db/pialert.db

if [ -f "$FILEDB" ]; then
chown -R www-data:www-data $INSTALL_DIR/pialert/db/pialert.db
fi

echo "[INSTALL] Copy starter pialert.db and pialert.conf if they don't exist"

# Copy starter pialert.db and pialert.conf if they don't exist
cp -n "$INSTALL_DIR/pialert/back/pialert.conf" "$INSTALL_DIR/pialert/config/pialert.conf"
cp -n "$INSTALL_DIR/pialert/back/pialert.db" "$INSTALL_DIR/pialert/db/pialert.db"
cp -n "$INSTALL_DIR/pialert/back/pialert.db" "$FILEDB"

echo "[INSTALL] Fixing permissions after copied starter config & DB"

if [ -f "$FILEDB" ]; then
chown -R www-data:www-data $FILEDB
fi

chmod -R a+rwx $INSTALL_DIR # second time after we copied the files
chmod -R a+rw $INSTALL_DIR/pialert/config
@@ -104,7 +126,6 @@ if [ ! -f "$INSTALL_DIR/pialert/front/buildtimestamp.txt" ]; then
date +%s > "$INSTALL_DIR/pialert/front/buildtimestamp.txt"
fi


# start PHP
/etc/init.d/php8.2-fpm start
/etc/init.d/nginx start

@@ -1,4 +1,4 @@
#!/bin/bash
#!/usr/bin/env bash

echo "---------------------------------------------------------"
echo "[INSTALL] Run user-mapping.sh"
@@ -9,7 +9,7 @@ if [ -z "${USER}" ]; then
fi

# if both not set we do not need to do anything
if [ -z "${HOST_USER_ID}" -a -z "${HOST_USER_GID}" ]; then
if [ -z "${HOST_USER_ID}" ] && [ -z "${HOST_USER_GID}" ]; then
echo "Nothing to do here." ; exit 0
fi

@@ -20,20 +20,20 @@ USER_GID=${HOST_USER_GID:=$USER_GID}

LINE=$(grep -F "${USER}" /etc/passwd)
# replace all ':' with a space and create array
array=( ${LINE//:/ } )
array=( "${LINE//:/ }" )

# home is 5th element
USER_HOME=${array[4]}

# print debug output
echo USER_ID : ${USER_ID};
echo USER_GID : ${USER_GID};
echo USER_HOME: ${USER_HOME};
echo TZ : ${TZ};
echo USER_ID" ": "${USER_ID}";
echo USER_GID : "${USER_GID}";
echo USER_HOME: "${USER_HOME}";
echo TZ" ": "${TZ}";

sed -i -e "s/^${USER}:\([^:]*\):[0-9]*:[0-9]*/${USER}:\1:${USER_ID}:${USER_GID}/" /etc/passwd
sed -i -e "s/^${USER}:\([^:]*\):[0-9]*/${USER}:\1:${USER_GID}/" /etc/group

chown -R ${USER_ID}:${USER_GID} ${USER_HOME}
chown -R "${USER_ID}:${USER_GID} ${USER_HOME}"

exec su - "${USER}"
exec su - "${USER}"

@@ -12,7 +12,7 @@
| CurrentScan | Result of the current scan | ![Screen1][screen1] |
| Devices | The main devices database that also contains the Network tree mappings. If `ScanCycle` is set to `0` device is not scanned. | ![Screen2][screen2] |
| Events | Used to collect connection/disconnection events. | ![Screen4][screen4] |
| Online_History | Used to display the `Device presence over time` chart | ![Screen6][screen6] |
| Online_History | Used to display the `Device presence` chart | ![Screen6][screen6] |
| Parameters | Used to pass values between the frontend and backend. | ![Screen7][screen7] |
| Pholus_Scan | Scan results of the Pholus python network penetration script. | ![Screen8][screen8] |
| Plugins_Events | For capturing events exposed by a plugin via the `last_result.log` file. If unique then saved into the `Plugins_Objects` table. Entries are deleted once processed and stored in the `Plugins_History` and/or `Plugins_Objects` tables. | ![Screen10][screen10] |

@@ -8,22 +8,27 @@ Check the the HTTP response of the failing backend call by following these steps
![F12DeveloperConsole][F12DeveloperConsole]

- Copy the URL causing the error and enter it in the address bar of your browser directly and hit enter. The copied URLs could look something like this (notice the query strings at the end):
- `http://<pialert URL>:20211/api/table_devices.json?nocache=1704141103121`
- `http://<pialert URL>:20211/php/server/devices.php?action=getDevicesTotals`
- `http://<pialert URL>:20211/php/server/devices.php?action=getDevicesList&status=all`
- `http://<pialert URL>:20211/php/server/devices.php?action=getDevicesList&status=all`

- Post the error response in the existing issue thread on GitHub or create a new issue and include the redacted response of the failing query.

For reference, the above queries should return results in the following format:

First URL:
## First URL:

- Should yield a valid JSON file

## Second URL:

![array][array]

Second URL:
## Third URL:

![json][json]

You can copy and paste any JSON result (result of the second query) into an online JSON checker, such as [this one](https://jsonchecker.com/) to check if it's valid.
You can copy and paste any JSON result (result of the First and Third query) into an online JSON checker, such as [this one](https://jsonchecker.com/) to check if it's valid.

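The same check can be done without leaving the terminal; a rough sketch using `curl` and Python's built-in JSON parser (`<pialert URL>` stays a placeholder):

```bash
# A non-zero exit from json.tool means the response is not valid JSON.
curl -s "http://<pialert URL>:20211/api/table_devices.json?nocache=$(date +%s)" \
  | python3 -m json.tool > /dev/null \
  && echo "valid JSON" \
  || echo "invalid JSON"
```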
[F12DeveloperConsole]: ./img/DEBUG/Invalid_JSON_repsonse_debug.png "F12DeveloperConsole"

74  docs/DEBUG_PLUGINS.md (new executable file)
@@ -0,0 +1,74 @@
# Troubleshooting plugins

## High-level overview

If a Plugin supplies data to the main app it's doine either vie a SQL query or via a script that updates the `last_result.log` file in the plugin folder (`front/plugins/<plugin>`).

For a more in-depth overview on how plugins work check the [Plugins development docs](https://github.com/jokob-sk/Pi.Alert/blob/main/front/plugins/README.md).

### Prerequisites

- Make sure you read and followed the specific plugin setup instructions.
- Ensure you have [debug enabled (see More Logging)](https://github.com/jokob-sk/Pi.Alert/blob/main/docs/DEBUG_TIPS.md#1-more-logging-)

### Potential issues

- Bugs
- Unexpected input (e.g. special characters in names)
- Dependencies changed how data is output

#### Incorrect input data

Input data from the plugin might cause mapping issues in specific edge cases. Look for a corresponding section in the `pialert.log` file, for example notice the first line of the execution run of the `PIHOLE` plugin below:

```
17:31:05 [Scheduler] - Scheduler run for PIHOLE: YES
17:31:05 [Plugin utils] ---------------------------------------------
17:31:05 [Plugin utils] display_name: PiHole (Device sync)
17:31:05 [Plugins] CMD: SELECT n.hwaddr AS Object_PrimaryID, {s-quote}null{s-quote} AS Object_SecondaryID, datetime() AS DateTime, na.ip AS Watched_Value1, n.lastQuery AS Watched_Value2, na.name AS Watched_Value3, n.macVendor AS Watched_Value4, {s-quote}null{s-quote} AS Extra, n.hwaddr AS ForeignKey FROM EXTERNAL_PIHOLE.Network AS n LEFT JOIN EXTERNAL_PIHOLE.Network_Addresses AS na ON na.network_id = n.id WHERE n.hwaddr NOT LIKE {s-quote}ip-%{s-quote} AND n.hwaddr is not {s-quote}00:00:00:00:00:00{s-quote} AND na.ip is not null
17:31:05 [Plugins] setTyp: subnets
17:31:05 [Plugin utils] Flattening the below array
17:31:05 ['192.168.1.0/24 --interface=eth1']
17:31:05 [Plugin utils] isinstance(arr, list) : False | isinstance(arr, str) : True
17:31:05 [Plugins] Resolved value: 192.168.1.0/24 --interface=eth1
17:31:05 [Plugins] Convert to Base64: True
17:31:05 [Plugins] base64 value: b'MTkyLjE2OC4xLjAvMjQgLS1pbnRlcmZhY2U9ZXRoMQ=='
17:31:05 [Plugins] Timeout: 10
17:31:05 [Plugins] Executing: SELECT n.hwaddr AS Object_PrimaryID, 'null' AS Object_SecondaryID, datetime() AS DateTime, na.ip AS Watched_Value1, n.lastQuery AS Watched_Value2, na.name AS Watched_Value3, n.macVendor AS Watched_Value4, 'null' AS Extra, n.hwaddr AS ForeignKey FROM EXTERNAL_PIHOLE.Network AS n LEFT JOIN EXTERNAL_PIHOLE.Network_Addresses AS na ON na.network_id = n.id WHERE n.hwaddr NOT LIKE 'ip-%' AND n.hwaddr is not '00:00:00:00:00:00' AND na.ip is not null
17:31:05 [Plugins] SUCCESS, received 2 entries
17:31:05 [Plugins] sqlParam entries: [(0, 'PIHOLE', '01:01:01:01:01:01', 'null', 'null', '2023-12-25 06:31:05', '172.30.0.1', 0, 'aaaa', 'vvvvvvvvv', 'not-processed', 'null', 'null', '01:01:01:01:01:01'), (0, 'PIHOLE', '02:42:ac:1e:00:02', 'null', 'null', '2023-12-25 06:31:05', '172.30.0.2', 0, 'dddd', 'vvvvv2222', 'not-processed', 'null', 'null', '02:42:ac:1e:00:02')]
17:31:05 [Plugins] Processing : PIHOLE
17:31:05 [Plugins] Existing objects from Plugins_Objects: 4
17:31:05 [Plugins] Logged events from the plugin run : 2
17:31:05 [Plugins] pluginEvents count: 2
17:31:05 [Plugins] pluginObjects count: 4
17:31:05 [Plugins] events_to_insert count: 0
17:31:05 [Plugins] history_to_insert count: 4
17:31:05 [Plugins] objects_to_insert count: 0
17:31:05 [Plugins] objects_to_update count: 4
17:31:05 [Plugin utils] In pluginEvents there are 2 events with the status "watched-not-changed"
17:31:05 [Plugin utils] In pluginObjects there are 2 events with the status "missing-in-last-scan"
17:31:05 [Plugin utils] In pluginObjects there are 2 events with the status "watched-not-changed"
17:31:05 [Plugins] Mapping objects to database table: CurrentScan
17:31:05 [Plugins] SQL query for mapping: INSERT into CurrentScan ( "cur_MAC", "cur_IP", "cur_LastQuery", "cur_Name", "cur_Vendor", "cur_ScanMethod") VALUES ( ?, ?, ?, ?, ?, ?)
17:31:05 [Plugins] SQL sqlParams for mapping: [('01:01:01:01:01:01', '172.30.0.1', 0, 'aaaa', 'vvvvvvvvv', 'PIHOLE'), ('02:42:ac:1e:00:02', '172.30.0.2', 0, 'dddd', 'vvvvv2222', 'PIHOLE')]
17:31:05 [API] Update API starting
17:31:06 [API] Updating table_plugins_history.json file in /front/api
```

In the above output notice the section logging how many events are produced by the plugin:

```
17:31:05 [Plugins] Existing objects from Plugins_Objects: 4
17:31:05 [Plugins] Logged events from the plugin run : 2
17:31:05 [Plugins] pluginEvents count: 2
17:31:05 [Plugins] pluginObjects count: 4
17:31:05 [Plugins] events_to_insert count: 0
17:31:05 [Plugins] history_to_insert count: 4
17:31:05 [Plugins] objects_to_insert count: 0
17:31:05 [Plugins] objects_to_update count: 4
```

These values, if formatted correctly, will also show up in the UI:


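To pull just one plugin's run out of a long log, a small sketch (the `PIHOLE` name and the default log path are assumptions, substitute the plugin you are troubleshooting):

```bash
# Print the scheduler line for the plugin plus the following 40 lines of its run.
grep -A 40 "Scheduler run for PIHOLE" /home/pi/pialert/front/log/pialert.log
```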
@@ -56,7 +56,7 @@ services:
* If facing issues (AJAX errors, can't write to DB, empty screen, etc,) make sure permissions are set correctly, and check the logs under `/home/pi/pialert/front/log`.
* To solve permission issues you can try setting the owner and group of the `pialert.db` by executing the following on the host system: `docker exec pialert chown -R www-data:www-data /home/pi/pialert/db/pialert.db`.
* Map to local User and Group IDs. Specify the enviroment variables `HOST_USER_ID` and `HOST_USER_GID` if needed.
* If still facing issues, try to map the pialert.db file (⚠ not folder) to `:/home/pi/pialert/db/pialert.db` (see Examples below for details)
* If still facing issues, try to map the pialert.db file (⚠ not folder) to `:/home/pi/pialert/db/pialert.db` (see [docker-compose Examples](https://github.com/jokob-sk/Pi.Alert/blob/main/dockerfiles/README.md#-docker-composeyml-examples) for details)

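A sketch of passing the host IDs mentioned above, assuming the same image and `host` network as in the usage example:

```bash
# Map the container user to the current host user to avoid permission mismatches.
docker run -d --rm --network=host \
  -e HOST_USER_ID="$(id -u)" \
  -e HOST_USER_GID="$(id -g)" \
  jokobsk/pi.alert:latest
```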
### Container restarts / crashes

@@ -1,20 +1,44 @@
# How to install PiAlert on the server hardware

To download and install PiAlert on the hardware/server directly use `curl` or `wget` commands.
To download and install PiAlert on the hardware/server directly use the `curl` or `wget` commands at the bottom of this page.

> [!NOTE]
> This is an Experimental feature 🧪 and it relies on community support.
> [!NOTE]
> This is an Experimental feature 🧪 and it relies on community support.
>
> There is no guarantee that the install script or any other script will gracefully handle other installed software.
> Data loss is a possibility, **it is recommended to install PiAlert using the supplied Docker image**.

A warning to the installation method below: Piping to bash is [controversial](https://pi-hole.net/2016/07/25/curling-and-piping-to-bash) and may
be dangerous, as you cannot see the code that's about to be executed on your system.

Alternatively you can download the installation script `install/install.sh` from the repository and check the code yourself (beware other scripts are
downloaded too - only from this repo).

PiAlert will be installed in `home/pi/pialert/` and run on port number `20211`.

## CURL
Some facts about what and where something will be changed/installed by the HW install setup (may not contain everything!):

- `/home/pi/pialert` directory will be deleted and newly created
- `/home/pi/pialert` will contain the whole repository (downloaded by `install/install.sh`)
- The default NGINX site `/etc/nginx/sites-enabled/default` will be disabled (sym-link deleted or backed up to `sites-available`)
- `/var/www/html/pialert` directory will be deleted and newly created
- `/etc/nginx/conf.d/pialert.conf` will be sym-linked to `/home/pi/pialert/install/pialert.conf`
- Some files (IEEE device vendors info, ...) will be created in the directory where the installation script is executed

## Limitations

- No system service is provided. PiAlert must be started using `/home/pi/pialert/dockerfiles/start.sh`.
- No checks for other running software is done.
- Only tested to work on Debian Bookworm (Debian 12).
- **EXPERIMENTAL** and not recommended way to install PiAlert.

## 📥 Installation via CURL

```bash
curl -o install.sh https://raw.githubusercontent.com/jokob-sk/Pi.Alert/main/install/install.sh && sudo chmod +x install.sh && sudo ./install.sh
```

## WGET

## 📥 Installation via WGET

```bash
wget https://raw.githubusercontent.com/jokob-sk/Pi.Alert/main/install/install.sh -O install.sh && sudo chmod +x install.sh && sudo ./install.sh
@@ -22,4 +46,4 @@ wget https://raw.githubusercontent.com/jokob-sk/Pi.Alert/main/install/install.sh

These commands will download the `install.sh` script from the GitHub repository, make it executable with `chmod`, and then run it using `./install.sh`.

Make sure you have the necessary permissions to execute the script.
Make sure you have the necessary permissions to execute the script.

@@ -8,12 +8,11 @@ Make sure you have a root device with the MAC `Internet` (No other MAC addresses

## ⚡Quick setup:

* Go to Devices > Device Details.
* Find the device(s) you want to use as network devices (network nodes).
* Set the Type of such a device to one of the following: AP, Firewall, Gateway, PLC, Powerline, Router, Switch, USB LAN Adapter, USB WIFI Adapter and WLAN.
* Go to a Device you want to use as network device (network nodes, such as a Switch).
* Set the **Type** of such a device to one of the following: AP, Firewall, Gateway, PLC, Powerline, Router, Switch, USB LAN Adapter, USB WIFI Adapter and WLAN (you can create a custom network type device with in Settings -> General -> `NETWORK_DEVICE_TYPES`).
* Save and go to Network where the devices you've marked as network devices (by selecting the Type as mentioned above) will show up as tabs.
* You can now assign the Unassigend devices to the correct network node.
* If port is empty or 0 a wifi icon is rendered, otherwise a ethernet port icon
* You can now assign the Unassigend devices to the network node.
* If port is empty or 0 a wifi icon is rendered, otherwise a ethernet port icon.


> [!NOTE]
@@ -46,7 +45,7 @@ In this example you will setup a device named `rapberrypi` as a `Switch` in our


- Notice the newly added `raspberrypi` (2) tab which now represents a network node, also showing up in the tree (3).
- As we asssigned the `raspberrypi` in the previous 1) Device details page section to the `Internet` parent network node in step (6), the link is also showing up in the tree diagram (4)
- As we asssigned the `raspberrypi` in the previous (1) Device details page section to the `Internet` parent network node in step (6), the link is also showing up in the tree diagram (4)
- We can now assign the device `(AppleTV)` (5) to this `raspberrypi` node, representing a network Switch in this example

### 3. Network page with 2 levels

@@ -1,16 +1,27 @@
## Documentation overview

In the app hover over settings or fields/labels or click blue in-app ❔ (question-mark) icons to get to relevant documentation pages.
<details>
<summary>:information_source: In the app hover over settings or fields/labels or click blue in-app ❔ (question-mark) icons to get to relevant documentation pages.</summary>




</details>

There is also an in-app Help / FAQ section that should be answering frequently asked questions.

### 📥 Installation

⚠ Only tested as a [docker container - follow these instructions here](https://github.com/jokob-sk/Pi.Alert/blob/main/dockerfiles/README.md).
> Check out [leiweibau's fork](https://github.com/leiweibau/Pi.Alert/) if you want to install Pi.Alert on the server directly or original instructions for [pucherot's original code](https://github.com/pucherot/Pi.Alert/)
#### 🐳 Docker (Fully supported)

- The main installation method is as a [docker container - follow these instructions here](https://github.com/jokob-sk/Pi.Alert/blob/main/dockerfiles/README.md).

#### 💻 Bare-metal / On-server (Experimental/community supported 🧪)

- [(Experimental 🧪) On-hardware instructions](https://github.com/jokob-sk/Pi.Alert/blob/main/docs/HW_INSTALL.md)

- Alternative bare-metal install forks:
- [leiweibau's fork](https://github.com/leiweibau/Pi.Alert/) (maintained)
- [pucherot's original code](https://github.com/pucherot/Pi.Alert/) (un-maintained)

### 📚 Table of contents

@@ -18,6 +29,7 @@ There is also an in-app Help / FAQ section that should be answering frequently a

- [Debugging tips](/docs/DEBUG_TIPS.md)
- [Invalid JSON errors debug help](/docs/DEBUG_INVALID_JSON.md)
- [Troubleshooting Plugins](/docs/DEBUG_PLUGINS.md)

#### 🔝 Popular/Suggested

@@ -31,7 +43,7 @@ There is also an in-app Help / FAQ section that should be answering frequently a

- [Manage devices (legacy docs)](/docs/DEVICE_MANAGEMENT.md)
- [Random MAC/MAC icon meaning (legacy docs)](/docs/RANDOM_MAC.md)
- [Custom Icons configuration and support](/docs/ICONS.md)
- [Custom Icon configuration and support](/docs/ICONS.md)

#### 🔎 Examples

@@ -96,7 +108,7 @@ Suggested test cases:

- Blank setup with no DB or config
- Existing DB / config
- Sending a notification (e. g. Delete a device and wait for a scan to run) and testing all notification gateways, especially:
- Email, Apprise (e.g. via Telegram), webhook (e.g. via Discord), MQTT (e.g. via HomeAssitant)
- Email, Apprise (e.g. via Telegram), webhook (e.g. via Discord), MQTT (e.g. via Home Assistant)
- Saving settings
- Test a couple of plugins
- Check the Error log for anything unusual
@@ -110,7 +122,7 @@ Some additional context:

Before submitting a new issue please spend a couple of minutes on research:

* Check [🛑 Common issues](https://github.com/jokob-sk/Pi.Alert/tree/main/dockerfiles#-common-issues)
* Check [🛑 Common issues](https://github.com/jokob-sk/Pi.Alert/blob/main/docs/DEBUG_TIPS.md#common-issues)
* Check [💡 Closed issues](https://github.com/jokob-sk/Pi.Alert/issues?q=is%3Aissue+is%3Aclosed) if a similar issue was solved in the past.
* When submitting an issue ❗[enable debug](https://github.com/jokob-sk/Pi.Alert/blob/main/docs/DEBUG_TIPS.md)❗

@@ -10,7 +10,7 @@ The source of truth for user-defined values is the `pialert.conf` file. Editing

#### Settings database table

The `Settings` database table contains settings for App run purposes. The table is recreated every time the App restarts. The settings are loaded from the source-of-truth, that is the `pialert.conf` file. A high-level overview on the databse structure can be found in the [database documentation](/docs/DATABASE.md).
The `Settings` database table contains settings for App run purposes. The table is recreated every time the App restarts. The settings are loaded from the source-of-truth, that is the `pialert.conf` file. A high-level overview on the database structure can be found in the [database documentation](/docs/DATABASE.md).

#### table_settings.json


@@ -4,6 +4,10 @@ You need to specify the network interface and the network mask. You can also con

## Examples

> [!NOTE]
> Please use the UI to configure settings as that ensures that the config file is in the correct format. Edit `pialert.conf` directly only when really necessary.
> 

* Examples for one and two subnets (❗ Note the `['...', '...']` format):
* One subnet: `SCAN_SUBNETS = ['192.168.1.0/24 --interface=eth0']`
* Two subnets: `SCAN_SUBNETS = ['192.168.1.0/24 --interface=eth0', '192.168.1.0/24 --interface=eth1 -vlan=107']`

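Before writing the setting it can help to confirm the interface name and subnet on the host; a quick sketch with standard iproute2 tools:

```bash
# List interfaces with their addresses, then the routed subnets they belong to.
ip -br addr show
ip route
```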
BIN  docs/img/DEBUG_PLUGINS/plugin_objects_pihole.png (new executable file; binary file not shown, after: 127 KiB)
@@ -644,14 +644,15 @@ if ($ENABLED_DARKMODE === True) {
|
||||
// ------------------------------------------------------------
|
||||
function getDevicesList()
|
||||
{
|
||||
// Read cache
|
||||
devicesList = getCache('devicesList');
|
||||
// Read cache (skip cookie expiry check)
|
||||
devicesList = getCache('devicesListAll_JSON', true);
|
||||
|
||||
if (devicesList != '') {
|
||||
devicesList = JSON.parse (devicesList);
|
||||
} else {
|
||||
devicesList = [];
|
||||
}
|
||||
|
||||
return devicesList;
|
||||
}
|
||||
|
||||
@@ -1283,7 +1284,7 @@ function getDeviceData (readAllData=false) {
|
||||
history.pushState(null, '', newRelativePathQuery);
|
||||
getSessionsPresenceEvents();
|
||||
|
||||
devicesList = getDevicesList();
|
||||
devicesList = getDevicesList();
|
||||
|
||||
$('#txtMAC').val (deviceData['dev_MAC']);
|
||||
$('#txtName').val (deviceData['dev_Name']);
|
||||
@@ -1324,7 +1325,8 @@ function getDeviceData (readAllData=false) {
|
||||
}
|
||||
|
||||
// Check if device is part of the devicesList
|
||||
pos = devicesList.findIndex(item => item.rowid == deviceData['rowid']);
|
||||
pos = devicesList.findIndex(item => item.rowid == deviceData['rowid']);
|
||||
|
||||
if (pos == -1) {
|
||||
devicesList.push({"rowid" : deviceData['rowid'], "mac" : deviceData['dev_MAC'], "name": deviceData['dev_Name'], "type": deviceData['dev_DeviceType']});
|
||||
pos=0;
|
||||
@@ -1398,7 +1400,7 @@ function performSwitch(direction)
|
||||
// get new mac from the devicesList. Don't change to the commented out line below, the mac query string in the URL isn't updated yet!
|
||||
// mac = params.mac;
|
||||
|
||||
mac = devicesList[pos].mac.toString();
|
||||
mac = devicesList[pos].dev_MAC.toString();
|
||||
|
||||
setCache("piaDeviceDetailsMac", mac);
|
||||
|
||||
@@ -1457,6 +1459,9 @@ function setDeviceData (direction='', refreshCallback='') {
|
||||
window.onbeforeunload = null;
|
||||
somethingChanged = false;
|
||||
|
||||
// refresh API
|
||||
updateApi()
|
||||
|
||||
// Callback function
|
||||
if (typeof refreshCallback == 'function') {
|
||||
refreshCallback(direction);
|
||||
@@ -1464,6 +1469,25 @@ function setDeviceData (direction='', refreshCallback='') {
|
||||
});
|
||||
}
|
||||
|
||||
// --------------------------------------------------------
|
||||
// Calls a backend function to add a front-end event to an execution queue
|
||||
function updateApi()
|
||||
{
|
||||
|
||||
// value has to be in format event|param. e.g. run|ARPSCAN
|
||||
action = `update_api|devices`
|
||||
|
||||
$.ajax({
|
||||
method: "POST",
|
||||
url: "php/server/util.php",
|
||||
data: { function: "addToExecutionQueue", action: action },
|
||||
success: function(data, textStatus) {
|
||||
console.log(data)
|
||||
}
|
||||
})
|
||||
}
|
||||
|
||||
|
||||
// -----------------------------------------------------------------------------
|
||||
function askSkipNotifications () {
|
||||
// Check MAC
|
||||
@@ -1630,36 +1654,10 @@ function deleteDevice () {
|
||||
|
||||
// Deactivate controls
|
||||
$('#panDetails :input').attr('disabled', true);
|
||||
|
||||
// refresh API
|
||||
updateApi()
|
||||
}
|
||||
// -----------------------------------------------------------------------------
|
||||
function askDeleteDevice () {
|
||||
// Check MAC
|
||||
if (mac == '') {
|
||||
return;
|
||||
}
|
||||
|
||||
// Ask delete device
|
||||
showModalWarning ('Delete Device', 'Are you sure you want to delete this device?<br>(maybe you prefer to archive it)',
|
||||
'<?= lang('Gen_Cancel');?>', '<?= lang('Gen_Delete');?>', 'deleteDevice');
|
||||
}
|
||||
|
||||
|
||||
// -----------------------------------------------------------------------------
|
||||
function deleteDevice () {
|
||||
// Check MAC
|
||||
if (mac == '') {
|
||||
return;
|
||||
}
|
||||
|
||||
// Delete device
|
||||
$.get('php/server/devices.php?action=deleteDevice&mac='+ mac, function(msg) {
|
||||
showMessage (msg);
|
||||
});
|
||||
|
||||
// Deactivate controls
|
||||
$('#panDetails :input').attr('disabled', true);
|
||||
}
|
||||
|
||||
|
||||
// -----------------------------------------------------------------------------
|
||||
function getSessionsPresenceEvents () {
|
||||
@@ -1812,8 +1810,8 @@ function toggleNetworkConfiguration(disable)
|
||||
|
||||
if(disable)
|
||||
{
|
||||
$('#txtNetworkNodeMac').val(getString('Network_Root_Unconfigurable'));
|
||||
$('#txtNetworkPort').val(getString('Network_Root_Unconfigurable'));
|
||||
// $('#txtNetworkNodeMac').val(getString('Network_Root_Unconfigurable'));
|
||||
// $('#txtNetworkPort').val(getString('Network_Root_Unconfigurable'));
|
||||
$('#txtNetworkPort').prop('readonly', true );
|
||||
$('.parentNetworkNode .input-group-btn').hide();
|
||||
}
|
||||
|
||||
@@ -288,7 +288,12 @@ function main () {
|
||||
initializeDatatable();
|
||||
|
||||
// query data
|
||||
getDevicesTotals();
|
||||
getDevicesTotals();
|
||||
|
||||
// check if data is outdated and show spinner if so
|
||||
handleLoadingDialog()
|
||||
|
||||
|
||||
});
|
||||
});
|
||||
});
|
||||
@@ -322,7 +327,7 @@ function filterDataByStatus(data, status) {
|
||||
case 'new':
|
||||
return item.dev_NewDevice === 1;
|
||||
case 'down':
|
||||
return item.dev_PresentLastScan === 0 && item.dev_AlertDeviceDown === 1;
|
||||
return item.dev_PresentLastScan === 0 && item.dev_AlertDeviceDown !== 0;
|
||||
case 'archived':
|
||||
return item.dev_Archived === 1;
|
||||
default:
|
||||
@@ -338,7 +343,7 @@ function getDeviceStatus(item)
|
||||
{
|
||||
return 'On-line';
|
||||
}
|
||||
else if(item.dev_PresentLastScan === 0 && item.dev_AlertDeviceDown === 1)
|
||||
else if(item.dev_PresentLastScan === 0 && item.dev_AlertDeviceDown !== 0)
|
||||
{
|
||||
return 'Down';
|
||||
}
|
||||
@@ -390,7 +395,7 @@ function initializeDatatable (status) {
|
||||
}
|
||||
}
|
||||
|
||||
$.get('api/table_devices.json', function(result) {
|
||||
$.get('api/table_devices.json?nocache=' + Date.now(), function(result) {
|
||||
|
||||
// Filter the data based on deviceStatus
|
||||
var filteredData = filterDataByStatus(result.data, deviceStatus);
|
||||
@@ -641,21 +646,22 @@ function getDevicesTotals () {
|
||||
// -----------------------------------------------------------------------------
|
||||
function handleLoadingDialog()
|
||||
{
|
||||
$.get('api/app_state.json?nocache=' + Date.now(), function(appState) {
|
||||
$.get('log/execution_queue.log?nocache=' + Date.now(), function(data) {
|
||||
|
||||
console.log(appState["showSpinner"])
|
||||
if(appState["showSpinner"])
|
||||
{
|
||||
showSpinner("settings_old")
|
||||
|
||||
if(data.includes("update_api|devices"))
|
||||
{
|
||||
showSpinner("devices_old")
|
||||
|
||||
setTimeout("handleLoadingDialog()", 1000);
|
||||
|
||||
} else
|
||||
} else if ($("#loadingSpinner").is(":visible"))
|
||||
{
|
||||
hideSpinner()
|
||||
hideSpinner();
|
||||
location.reload();
|
||||
}
|
||||
|
||||
})
|
||||
})
|
||||
|
||||
}
|
||||
|
||||
|
||||
@@ -349,6 +349,7 @@ function sanitize(data)
|
||||
// -----------------------------------------------------------------------------
|
||||
function numberArrayFromString(data)
|
||||
{
|
||||
console.log(data)
|
||||
data = JSON.parse(sanitize(data));
|
||||
return data.replace(/\[|\]/g, '').split(',').map(Number);
|
||||
}
|
||||
@@ -514,15 +515,40 @@ function getNameByMacAddress(macAddress) {
|
||||
|
||||
// -----------------------------------------------------------------------------
|
||||
// A function used to make the IP address orderable
|
||||
function isValidIPv6(ipAddress) {
|
||||
// Regular expression for IPv6 validation
|
||||
const ipv6Regex = /^([0-9a-fA-F]{1,4}:){7,7}[0-9a-fA-F]{1,4}$|^([0-9a-fA-F]{1,4}:){1,7}:|^([0-9a-fA-F]{1,4}:){1,6}:[0-9a-fA-F]{1,4}$|^([0-9a-fA-F]{1,4}:){1,5}(:[0-9a-fA-F]{1,4}){1,2}$|^([0-9a-fA-F]{1,4}:){1,4}(:[0-9a-fA-F]{1,4}){1,3}$|^([0-9a-fA-F]{1,4}:){1,3}(:[0-9a-fA-F]{1,4}){1,4}$|^([0-9a-fA-F]{1,4}:){1,2}(:[0-9a-fA-F]{1,4}){1,5}$|^[0-9a-fA-F]{1,4}:((:[0-9a-fA-F]{1,4}){1,6})$/;
|
||||
|
||||
return ipv6Regex.test(ipAddress);
|
||||
}
|
||||
|
||||
function formatIPlong(ipAddress) {
|
||||
const parts = ipAddress.split('.');
|
||||
if (parts.length !== 4) {
|
||||
throw new Error('Invalid IP address format');
|
||||
if (ipAddress.includes(':') && isValidIPv6(ipAddress)) {
|
||||
const parts = ipAddress.split(':');
|
||||
|
||||
return parts.reduce((acc, part, index) => {
|
||||
if (part === '') {
|
||||
const remainingGroups = 8 - parts.length + 1;
|
||||
return acc << (16 * remainingGroups);
|
||||
}
|
||||
|
||||
const hexValue = parseInt(part, 16);
|
||||
return acc | (hexValue << (112 - index * 16));
|
||||
}, 0);
|
||||
} else {
|
||||
// Handle IPv4 address
|
||||
const parts = ipAddress.split('.');
|
||||
|
||||
if (parts.length !== 4) {
|
||||
console.log("⚠ Invalid IPv4 address: " + ipAddress);
|
||||
return -1; // or any other default value indicating an error
|
||||
}
|
||||
|
||||
return (parseInt(parts[0]) << 24) |
|
||||
(parseInt(parts[1]) << 16) |
|
||||
(parseInt(parts[2]) << 8) |
|
||||
parseInt(parts[3]);
|
||||
}
|
||||
return (parseInt(parts[0]) << 24) |
|
||||
(parseInt(parts[1]) << 16) |
|
||||
(parseInt(parts[2]) << 8) |
|
||||
parseInt(parts[3]);
|
||||
}
|
||||
|
||||
// -----------------------------------------------------------------------------
|
||||
|
||||
@@ -19,6 +19,25 @@
|
||||
return result;
|
||||
}
|
||||
|
||||
// -------------------------------------------------------------------
|
||||
// Get plugin code name based on prefix
|
||||
function getPluginCodeName(pluginsData, prefix)
|
||||
{
|
||||
var result = ""
|
||||
|
||||
pluginsData.forEach((plug) => {
|
||||
|
||||
if (plug.unique_prefix == prefix ) {
|
||||
id = plug.code_name;
|
||||
|
||||
// console.log(id)
|
||||
result = plug.code_name;
|
||||
}
|
||||
});
|
||||
|
||||
return result;
|
||||
}
|
||||
|
||||
|
||||
// -------------------------------------------------------------------
|
||||
// Get plugin type based on prefix
|
||||
@@ -61,19 +80,22 @@
|
||||
|
||||
});
|
||||
|
||||
html += `
|
||||
|
||||
html += `
|
||||
<div class="col-sm-4 ">
|
||||
<div class="small-box bg-green " >
|
||||
<div class="inner ">
|
||||
<a href="#${prefix}_header" onclick="toggleAllSettings('open')">
|
||||
<h5 class="card-title">
|
||||
${getString(prefix+"_display_name")}
|
||||
<b>${getString(prefix+"_display_name")}</b>
|
||||
</h5>
|
||||
${includeSettings_html}
|
||||
</a>
|
||||
${includeSettings_html}
|
||||
</div>
|
||||
<div class="icon"> ${getString(prefix+"_icon")} </div>
|
||||
|
||||
<a href="#${prefix}_header" onclick="toggleAllSettings('open')">
|
||||
<div class="icon"> ${getString(prefix+"_icon")} </div>
|
||||
</a>
|
||||
</div>
|
||||
|
||||
</div>
|
||||
`
|
||||
});
|
||||
@@ -81,6 +103,41 @@
|
||||
return html;
|
||||
}
|
||||
|
||||
|
||||
// -----------------------------------------------------------------------------
|
||||
// Open or close all settings
|
||||
// -----------------------------------------------------------------------------
|
||||
function toggleAllSettings(openOrClose = '')
|
||||
{
|
||||
inStr = ' in';
|
||||
allOpen = true;
|
||||
openIcon = 'fa-angle-double-down';
|
||||
closeIcon = 'fa-angle-double-up';
|
||||
|
||||
$('.panel-collapse').each(function(){
|
||||
if($(this).attr('class').indexOf(inStr) == -1)
|
||||
{
|
||||
allOpen = false;
|
||||
}
|
||||
})
|
||||
|
||||
if(allOpen == false || openOrClose == 'open')
|
||||
{
|
||||
// open all
|
||||
$('div[data-myid="collapsible"]').each(function(){$(this).attr('class', 'panel-collapse collapse in')})
|
||||
$('div[data-myid="collapsible"]').each(function(){$(this).attr('style', 'height:inherit')})
|
||||
$('#toggleSettings').attr('class', $('#toggleSettings').attr('class').replace(openIcon, closeIcon))
|
||||
|
||||
}
|
||||
else{
|
||||
// close all
|
||||
$('div[data-myid="collapsible"]').each(function(){$(this).attr('class', 'panel-collapse collapse ')})
|
||||
$('#toggleSettings').attr('class', $('#toggleSettings').attr('class').replace(closeIcon, openIcon))
|
||||
}
|
||||
|
||||
}
|
||||
|
||||
|
||||
// -------------------------------------------------------------------
|
||||
// Checks if all schedules are the same
|
||||
function schedulesAreSynchronized(prefixesOfEnabledPlugins, pluginsData)
|
||||
|
||||
@@ -77,7 +77,7 @@ function getDeviceData() {
|
||||
|
||||
// Device Data
|
||||
$sql = 'SELECT rowid, *,
|
||||
CASE WHEN dev_AlertDeviceDown=1 AND dev_PresentLastScan=0 THEN "Down"
|
||||
CASE WHEN dev_AlertDeviceDown !=0 AND dev_PresentLastScan=0 THEN "Down"
|
||||
WHEN dev_PresentLastScan=1 THEN "On-line"
|
||||
ELSE "Off-line" END as dev_Status
|
||||
FROM Devices
|
||||
@@ -626,7 +626,7 @@ function getDevicesList() {
|
||||
|
||||
$sql = 'SELECT * FROM (
|
||||
SELECT rowid, *, CASE
|
||||
WHEN t1.dev_AlertDeviceDown=1 AND t1.dev_PresentLastScan=0 THEN "Down"
|
||||
WHEN t1.dev_AlertDeviceDown !=0 AND t1.dev_PresentLastScan=0 THEN "Down"
|
||||
WHEN t1.dev_NewDevice=1 THEN "New"
|
||||
WHEN t1.dev_PresentLastScan=1 THEN "On-line"
|
||||
ELSE "Off-line" END AS dev_Status
|
||||
@@ -1133,14 +1133,14 @@ function copyFromDevice() {
|
||||
//------------------------------------------------------------------------------
|
||||
function getDeviceCondition ($deviceStatus) {
|
||||
switch ($deviceStatus) {
|
||||
case 'all': return 'WHERE dev_Archived=0'; break;
|
||||
case 'connected': return 'WHERE dev_Archived=0 AND dev_PresentLastScan=1'; break;
|
||||
case 'favorites': return 'WHERE dev_Archived=0 AND dev_Favorite=1'; break;
|
||||
case 'new': return 'WHERE dev_Archived=0 AND dev_NewDevice=1'; break;
|
||||
case 'down': return 'WHERE dev_Archived=0 AND dev_AlertDeviceDown=1 AND dev_PresentLastScan=0'; break;
|
||||
case 'archived': return 'WHERE dev_Archived=1'; break;
|
||||
default: return 'WHERE 1=0'; break;
|
||||
}
|
||||
case 'all': return 'WHERE dev_Archived=0'; break;
|
||||
case 'connected': return 'WHERE dev_Archived=0 AND dev_PresentLastScan=1'; break;
|
||||
case 'favorites': return 'WHERE dev_Archived=0 AND dev_Favorite=1'; break;
|
||||
case 'new': return 'WHERE dev_Archived=0 AND dev_NewDevice=1'; break;
|
||||
case 'down': return 'WHERE dev_Archived=0 AND dev_AlertDeviceDown !=0 AND dev_PresentLastScan=0'; break;
|
||||
case 'archived': return 'WHERE dev_Archived=1'; break;
|
||||
default: return 'WHERE 1=0'; break;
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
|
||||
@@ -558,17 +558,13 @@
|
||||
"PIALERT_WEB_PROTECTION_name": "Login aktivieren",
|
||||
"PIALERT_WEB_PROTECTION_description": "Ein Loginfenster wird angezeigt wenn aktiviert. Untere Beschreibung genau durchlesen falls Sie sich aus Ihrer Instanz aussperren.",
|
||||
"PIALERT_WEB_PASSWORD_name": "Login-Passwort",
|
||||
"PIALERT_WEB_PASSWORD_description": "Das Standardpasswort ist <code>123456</code>. Um das Passwort zu ändern, entweder <code>/home/pi/pialert/back/pialert-cli</code> im Container starten oder <a onclick=\"toggleAllSettings()\" href=\"#SETPWD_RUN\"><code>SETPWD_RUN</code> Set password plugin</a> nutzen.",
|
||||
"INCLUDED_SECTIONS_name": "Benachrichtigungen",
|
||||
"INCLUDED_SECTIONS_description": "Spezifiziert, bei welchen Events Benachrichtigungen versendet werden. Entfernen Sie die Eventtypen, bei welchen Sie nicht benachrichtigt werden wollen. Diese Einstellung überschreibt gerätespezifische Einstellungen im UI. (<code>STRG + klicken</code> zum aus-/abwählen).",
|
||||
"PIALERT_WEB_PASSWORD_description": "Das Standardpasswort ist <code>123456</code>. Um das Passwort zu ändern, entweder <code>/home/pi/pialert/back/pialert-cli</code> im Container starten oder <a onclick=\"toggleAllSettings()\" href=\"#SETPWD_RUN\"><code>SETPWD_RUN</code> Set password plugin</a> nutzen.",
|
||||
"DAYS_TO_KEEP_EVENTS_name": "Lösche Events älter als",
|
||||
"DAYS_TO_KEEP_EVENTS_description": "Dies ist eine Wartungseinstellung. Spezifiziert wie viele Tage Events gespeichert bleiben. Alle älteren Events werden periodisch gelöscht. Wird auch auf die Plugins History angewendet.",
|
||||
"HRS_TO_KEEP_NEWDEV_name": "Neue Geräte speichern für",
|
||||
"HRS_TO_KEEP_NEWDEV_description": "Dies ist eine Wartungseinstellung. Geräte markiert als <b>Neues Gerät</b> werden gelöscht, wenn ihre <b>Erste Sitzung</b> länger her ist als die angegebenen Stunden in dieser Einstellung. <code>0</code> deaktiviert diese Funktion. Nutzen Sie diese Einstellung, um <b>Neue Geräte</b> automatisch nach <code>X</code> Stunden zu löschen.",
|
||||
"REPORT_DASHBOARD_URL_name": "Pi.Alert URL",
|
||||
"REPORT_DASHBOARD_URL_description": "Diese URL wird als Basis fürs Erstellen von Links in E-Mails genutzt. Geben Sie die gesamte URL startend mit <code>http://</code> inklusive der genutzten Portnummer ein (keinen nachfolgenden Schrägstrich <code>/</code> nutzen).",
|
||||
"DIG_GET_IP_ARG_name": "Erkennung externer IP (\"Internet IP\")",
|
||||
"DIG_GET_IP_ARG_description": "Ändere die Argumente des <a href=\"https://linux.die.net/man/1/dig\" target=\"_blank\">dig Dienstprogramms</a>, wenn Probleme beim Auflösen der externen IP auftreten. Argumente werden an das Ende des folgenden Befehls angehängt: <code>dig +short </code>.",
|
||||
"NETWORK_DEVICE_TYPES_name": "Netzwerkgeräte-Typen",
|
||||
"NETWORK_DEVICE_TYPES_description": "Welche Gerätetypen als Netzwerkgeräte in der Netzwerkansicht verwendet werden können. Der Gerätetyp muss genau der <code>Typ</code>-Einstellung eines spezifischen Geräts in den Gerätedetails übereinstimmen. Entfernen Sie keine existierenden Typen, sondern fügen Sie nur neue ein.",
|
||||
"UI_LANG_name": "UI Sprache",
|
||||
|
||||
File diff suppressed because it is too large
@@ -470,17 +470,13 @@
|
||||
"PIALERT_WEB_PROTECTION_name" : "Habilitar inicio de sesión",
|
||||
"PIALERT_WEB_PROTECTION_description" : "Cuando está habilitado, se muestra un cuadro de diálogo de inicio de sesión. Lea detenidamente a continuación si se le bloquea el acceso a su instancia.",
|
||||
"PIALERT_WEB_PASSWORD_name" : "Contraseña de inicio de sesión",
|
||||
"PIALERT_WEB_PASSWORD_description" : "La contraseña predeterminada es <code>123456</code>. Para cambiar la contraseña, ejecute <code>/home/pi/pialert/back/pialert-cli</code> en el contenedor",
|
||||
"INCLUDED_SECTIONS_name" : "Notificar en",
|
||||
"INCLUDED_SECTIONS_description" : "Especifica que eventos envían notificaciones. Elimina los tipos de eventos de los que no quieras recibir notificaciones. Este ajuste sobreescribe los ajustes específicos de los dispositivos en la interfaz. (<code>CTRL + Clic</code> para seleccionar / deseleccionar).",
|
||||
"PIALERT_WEB_PASSWORD_description" : "La contraseña predeterminada es <code>123456</code>. Para cambiar la contraseña, ejecute <code>/home/pi/pialert/back/pialert-cli</code> en el contenedor",
|
||||
"DAYS_TO_KEEP_EVENTS_name" : "Eliminar eventos anteriores a",
|
||||
"DAYS_TO_KEEP_EVENTS_description" : "Esta es una configuración de mantenimiento. Esto especifica el número de días de entradas de eventos que se guardarán. Todos los eventos anteriores se eliminarán periódicamente.",
|
||||
"HRS_TO_KEEP_NEWDEV_name": "Guardar nuevos dispositivos para",
|
||||
"HRS_TO_KEEP_NEWDEV_description": "Esta es una configuración de mantenimiento. Si está habilitado (<code>0</code> está deshabilitado), los dispositivos marcados como <b>Nuevo dispositivo</b> se eliminarán si su <b>Primera sesión</ b> el tiempo era anterior a las horas especificadas en esta configuración. Utilice esta configuración si desea eliminar automáticamente <b>Nuevos dispositivos</b> después de <code>X</code> horas.",
|
||||
"REPORT_DASHBOARD_URL_name" : "URL de Pi.Alert",
|
||||
"REPORT_DASHBOARD_URL_description" : "Esta URL se utiliza como base para generar enlaces en los correos electrónicos. Ingrese la URL completa que comienza con <code>http://</code>, incluido el número de puerto (sin barra inclinada al final <code>/</code>).",
|
||||
"DIG_GET_IP_ARG_name" : "Descubrir de IP de Internet",
|
||||
"DIG_GET_IP_ARG_description" : "Cambie los argumentos de la <a href=\"https://linux.die.net/man/1/dig\" target=\"_blank\">utilidad de dig</a> si tiene problemas para resolver su IP de Internet. Los argumentos se agregan al final del siguiente comando: <code>dig +short </code>.",
|
||||
"UI_LANG_name" : "Idioma de interfaz",
|
||||
"UI_LANG_description" : "Seleccione el idioma de interfaz de usuario preferido.",
|
||||
"UI_PRESENCE_name" : "Mostrar en el gráfico de presencia",
|
||||
|
||||
@@ -1,47 +1,41 @@
|
||||
## 📚 Docs for individual plugins
|
||||
> Community translations of this file (might be out-of-date): <a href="https://github.com/jokob-sk/Pi.Alert/blob/main/front/plugins/README_ES.md">Spanish(<img src="https://github.com/lipis/flag-icons/blob/main/flags/4x3/es.svg" alt="README_ES.md" style="height: 16px !important;width: 20px !important;padding-inline:3px !important;">)</a>, <a href="https://github.com/jokob-sk/Pi.Alert/blob/main/front/plugins/README_DE.md">German(<img src="https://github.com/lipis/flag-icons/blob/main/flags/4x3/de.svg" alt="README_DE.md" style="height: 16px !important;width: 20px !important;padding-inline:3px !important;">)</a>
|
||||
|
||||
### 🏴 Community translations of this file
|
||||
# 📚 Docs for individual plugins
|
||||
|
||||
> Please note there might be a delay between English and community translations.
|
||||
>[!NOTE]
|
||||
> Please check this [Plugins debugging guide](https://github.com/jokob-sk/Pi.Alert/blob/main/docs/DEBUG_PLUGINS.md) and the corresponding Plugin documentation in the table below if you are facing issues.
|
||||
|
||||
* <a href="https://github.com/jokob-sk/Pi.Alert/blob/main/front/plugins/README_ES.md">
|
||||
<img src="https://github.com/lipis/flag-icons/blob/main/flags/4x3/es.svg" alt="README_ES.md" style="height: 20px !important;width: 20px !important;"> Spanish (Spain)
|
||||
</a>
|
||||
## 🔌 Plugins & 📚 Docs
|
||||
|
||||
* <a href="https://github.com/jokob-sk/Pi.Alert/blob/main/front/plugins/README_DE.md">
|
||||
<img src="https://github.com/lipis/flag-icons/blob/main/flags/4x3/de.svg" alt="README_DE.md" style="height: 20px !important;width: 20px !important;"> German (Germany)
|
||||
</a>
|
||||
|
||||
### 🔌 Plugins & 📚 Docs
|
||||
|
||||
| Required | CurrentScan | Unique Prefix | Data source | Type | Link + Docs |
|
||||
|----------|-------------|---------------|--------------------|----------------|------------------------------------------------------------------|
|
||||
| | | APPRISE | Script | 💬 publisher | 📚[_publisher_apprise](/front/plugins/_publisher_apprise/) |
|
||||
| | Yes | ARPSCAN | Script | 🔍dev scanner | 📚[arp_scan](/front/plugins/arp_scan/) |
|
||||
| | | CSVBCKP | Script | ⚙ system | 📚[csv_backup](/front/plugins/csv_backup/) |
|
||||
| Yes* | | DBCLNP | Script | ⚙ system | 📚[db_cleanup](/front/plugins/db_cleanup/) |
|
||||
| | | DDNS | Script | ⚙ system | 📚[ddns_update](/front/plugins/ddns_update/) |
|
||||
| | Yes | DHCPLSS | Script | 🔍dev scanner | 📚[dhcp_leases](/front/plugins/dhcp_leases/) |
|
||||
| | | DHCPSRVS | Script | ♻ other | 📚[dhcp_servers](/front/plugins/dhcp_servers/) |
|
||||
| | Yes | INTRNT | Script | 🔍dev scanner | 📚[internet_ip](/front/plugins/internet_ip/) |
|
||||
| | | INTRSPD | Script | ♻ other | 📚[internet_speedtest](/front/plugins/internet_speedtest/) |
|
||||
| | | MAINT | Script | ⚙ system | 📚[maintenance](/front/plugins/maintenance/) |
|
||||
| | | MQTT | Script | 💬 publisher | 📚[_publisher_mqtt](/front/plugins/_publisher_mqtt/) |
|
||||
| Yes | | NEWDEV | Template | ⚙ system | 📚[newdev_template](/front/plugins/newdev_template/) |
|
||||
| | | NMAP | Script | ♻ other | 📚[nmap_scan](/front/plugins/nmap_scan/) |
|
||||
| | | NTFY | Script | 💬 publisher | 📚[_publisher_ntfy](/front/plugins/_publisher_ntfy/) |
|
||||
| | | PHOLUS | Script | ♻ other | 📚[pholus_scan](/front/plugins/pholus_scan/) |
|
||||
| | Yes | PIHOLE | External SQLite DB | 🔍dev scanner | 📚[pihole_scan](/front/plugins/pihole_scan/) |
|
||||
| | | PUSHSAFER | Script | 💬 publisher | 📚[_publisher_pushsafer](/front/plugins/_publisher_pushsafer/) |
|
||||
| | | SETPWD | Script | ⚙ system | 📚[set_password](/front/plugins/set_password/) |
|
||||
| | | SMTP | Script | 💬 publisher | 📚[_publisher_email](/front/plugins/_publisher_email/) |
|
||||
| | Yes | SNMPDSC | Script | 🔍dev scanner | 📚[snmp_discovery](/front/plugins/snmp_discovery/) |
|
||||
| | Yes** | UNDIS | Script | ♻ other | 📚[undiscoverables](/front/plugins/undiscoverables/) |
|
||||
| | Yes | UNFIMP | Script | 🔍dev scanner | 📚[unifi_import](/front/plugins/unifi_import/) |
|
||||
| | | VNDRPDT | Script | ⚙ system | 📚[vendor_update](/front/plugins/vendor_update/) |
|
||||
| | | WEBHOOK | Script | 💬 publisher | 📚[_publisher_webhook](/front/plugins/_publisher_webhook/) |
|
||||
| | | WEBMON | Script | ♻ other | 📚[website_monitor](/front/plugins/website_monitor/) |
|
||||
| N/A | | N/A | SQL query | | N/A, but the External SQLite DB plugins work similar |
|
||||
| Required | CurrentScan | Unique Prefix | Data source | Type | Link + Docs |
|
||||
|----------|-------------|---------------|--------------------|----------------|---------------------------------------------------------------------|
|
||||
| | | APPRISE | Script | 💬 publisher | 📚[_publisher_apprise](/front/plugins/_publisher_apprise/) |
|
||||
| | Yes | ARPSCAN | Script | 🔍dev scanner | 📚[arp_scan](/front/plugins/arp_scan/) |
|
||||
| | | CSVBCKP | Script | ⚙ system | 📚[csv_backup](/front/plugins/csv_backup/) |
|
||||
| Yes* | | DBCLNP | Script | ⚙ system | 📚[db_cleanup](/front/plugins/db_cleanup/) |
|
||||
| | | DDNS | Script | ⚙ system | 📚[ddns_update](/front/plugins/ddns_update/) |
|
||||
| | Yes | DHCPLSS | Script | 🔍dev scanner | 📚[dhcp_leases](/front/plugins/dhcp_leases/) |
|
||||
| | | DHCPSRVS | Script | ♻ other | 📚[dhcp_servers](/front/plugins/dhcp_servers/) |
|
||||
| | Yes | INTRNT | Script | 🔍dev scanner | 📚[internet_ip](/front/plugins/internet_ip/) |
|
||||
| | | INTRSPD | Script | ♻ other | 📚[internet_speedtest](/front/plugins/internet_speedtest/) |
|
||||
| | | MAINT | Script | ⚙ system | 📚[maintenance](/front/plugins/maintenance/) |
|
||||
| | | MQTT | Script | 💬 publisher | 📚[_publisher_mqtt](/front/plugins/_publisher_mqtt/) |
|
||||
| Yes | | NEWDEV | Template | ⚙ system | 📚[newdev_template](/front/plugins/newdev_template/) |
|
||||
| | | NMAP | Script | ♻ other | 📚[nmap_scan](/front/plugins/nmap_scan/) |
|
||||
| Yes | | NTFPRCS | Template | ⚙ system | 📚[notification_processing](/front/plugins/notification_processing/)|
|
||||
| | | NTFY | Script | 💬 publisher | 📚[_publisher_ntfy](/front/plugins/_publisher_ntfy/) |
|
||||
| | | PHOLUS | Script | ♻ other | 📚[pholus_scan](/front/plugins/pholus_scan/) |
|
||||
| | Yes | PIHOLE | External SQLite DB | 🔍dev scanner | 📚[pihole_scan](/front/plugins/pihole_scan/) |
|
||||
| | | PUSHSAFER | Script | 💬 publisher | 📚[_publisher_pushsafer](/front/plugins/_publisher_pushsafer/) |
|
||||
| | | SETPWD | Script | ⚙ system | 📚[set_password](/front/plugins/set_password/) |
|
||||
| | | SMTP | Script | 💬 publisher | 📚[_publisher_email](/front/plugins/_publisher_email/) |
|
||||
| | Yes | SNMPDSC | Script | 🔍dev scanner | 📚[snmp_discovery](/front/plugins/snmp_discovery/) |
|
||||
| | Yes** | UNDIS | Script | ♻ other | 📚[undiscoverables](/front/plugins/undiscoverables/) |
|
||||
| | Yes | UNFIMP | Script | 🔍dev scanner | 📚[unifi_import](/front/plugins/unifi_import/) |
|
||||
| | | VNDRPDT | Script | ⚙ system | 📚[vendor_update](/front/plugins/vendor_update/) |
|
||||
| | | WEBHOOK | Script | 💬 publisher | 📚[_publisher_webhook](/front/plugins/_publisher_webhook/) |
|
||||
| | | WEBMON | Script | ♻ other | 📚[website_monitor](/front/plugins/website_monitor/) |
|
||||
| N/A | | N/A | SQL query | | N/A, but the External SQLite DB plugins work similarly |
|
||||
|
||||
|
||||
> \* The database cleanup plugin (`DBCLNP`) is not _required_ but the app will become unusable after a while if not executed.
|
||||
@@ -400,7 +394,7 @@ Plugin results are always inserted into the standard `Plugin_Objects` database t
|
||||
>3. That's it. PiAlert takes care of the rest. It loops through the objects discovered by the plugin, takes the results line by line and inserts them into the database table specified in `"mapped_to_table"`. The columns are translated from the generic plugin columns to the target table via the `"mapped_to_column"` property in the column definitions.
|
||||
|
||||
> [!NOTE]
|
||||
> You can create a column mapping with a default value via the `mapped_to_column_data` property. This means that the value of the given column will always be this value. Taht also menas that the `"column": "NameDoesntMatter"` is not important as there is no databse source column.
|
||||
> You can create a column mapping with a default value via the `mapped_to_column_data` property. This means that the value of the given column will always be this value. That also means that the `"column": "NameDoesntMatter"` entry is not important, as there is no database source column.
|
||||
|
||||
|
||||
>🔍 Example:
|
||||
|
||||
@@ -381,7 +381,7 @@ Plugin results are always inserted into the standard `Plugin_Objects` database t
|
||||
>3. That's it. PiAlert takes care of the rest. It loops through the objects discovered by the plugin, takes the results line by line and inserts them into the database table specified in `"mapped_to_table"`. The columns are translated from the generic plugin columns to the target table via the `"mapped_to_column"` property in the column definitions.
|
||||
|
||||
> [!NOTE]
|
||||
> You can create a column mapping with a default value via the `mapped_to_column_data` property. This means that the value of the given column will always be this value. Taht also menas that the `"column": "NameDoesntMatter"` is not important as there is no databse source column.
|
||||
> You can create a column mapping with a default value via the `mapped_to_column_data` property. This means that the value of the given column will always be this value. That also means that the `"column": "NameDoesntMatter"` entry is not important, as there is no database source column.
|
||||
|
||||
|
||||
>🔍 Beispiel:
|
||||
|
||||
@@ -109,12 +109,12 @@ def send(pHTML, pText):
|
||||
|
||||
if get_setting_value("LOG_LEVEL") == 'debug':
|
||||
|
||||
send_email(msg)
|
||||
send_email(msg,smtp_timeout)
|
||||
|
||||
else:
|
||||
|
||||
try:
|
||||
send_email(msg)
|
||||
send_email(msg,smtp_timeout)
|
||||
|
||||
except smtplib.SMTPAuthenticationError as e:
|
||||
mylog('none', [' ERROR: Couldn\'t connect to the SMTP server (SMTPAuthenticationError)'])
|
||||
@@ -132,7 +132,7 @@ def send(pHTML, pText):
|
||||
mylog('none', [' ERROR: ', str(e)])
|
||||
|
||||
# ----------------------------------------------------------------------------------
|
||||
def send_email(msg):
|
||||
def send_email(msg,smtp_timeout):
|
||||
# Send mail
|
||||
if get_setting_value('SMTP_FORCE_SSL'):
|
||||
mylog('debug', ['SMTP_FORCE_SSL == True so using .SMTP_SSL()'])
|
||||
|
||||
@@ -254,7 +254,7 @@
|
||||
"events": ["test"],
|
||||
"type": "text.select",
|
||||
"default_value":"disabled",
|
||||
"options": ["disabled", "on_notification" ],
|
||||
"options": ["disabled", "on_notification", "once", "schedule", "always_after_scan", "on_new_device" ],
|
||||
"localized": ["name", "description"],
|
||||
"name" :[{
|
||||
"language_code": "en_us",
|
||||
@@ -267,7 +267,7 @@
|
||||
"description": [
|
||||
{
|
||||
"language_code": "en_us",
|
||||
"string" : "Enable sending notifications via <a target=\"_blank\" href=\"https://www.home-assistant.io/integrations/mqtt/\">MQTT</a> to your Home Assistance instance."
|
||||
"string" : "Enable sending notifications via <a target=\"_blank\" href=\"https://www.home-assistant.io/integrations/mqtt/\">MQTT</a> to your Home Assistance instance. Usually, <code>on_notification</code> is recommended. See the <a target=\"_blank\" href=\"https://github.com/jokob-sk/Pi.Alert/blob/main/docs/HOME_ASSISTANT.md\">PiAlert Home Assistant guide</a> for details."
|
||||
},
|
||||
{
|
||||
"language_code": "es_es",
|
||||
@@ -298,6 +298,37 @@
|
||||
"string" : "Comando a ejecutar"
|
||||
}]
|
||||
},
|
||||
{
|
||||
"function": "RUN_SCHD",
|
||||
"type": "text",
|
||||
"default_value":"0 2 * * 3",
|
||||
"options": [],
|
||||
"localized": ["name", "description"],
|
||||
"name" : [{
|
||||
"language_code":"en_us",
|
||||
"string" : "Schedule"
|
||||
},
|
||||
{
|
||||
"language_code":"es_es",
|
||||
"string" : "Schedule"
|
||||
},
|
||||
{
|
||||
"language_code":"de_de",
|
||||
"string" : "Schedule"
|
||||
}],
|
||||
"description": [{
|
||||
"language_code":"en_us",
|
||||
"string" : "Only enabled if you select <code>schedule</code> in the <a href=\"#MQTT_RUN\"><code>MQTT_RUN</code> setting</a>. Make sure you enter the schedule in the correct cron-like format (e.g. validate at <a href=\"https://crontab.guru/\" target=\"_blank\">crontab.guru</a>). For example entering <code>0 4 * * *</code> will run the scan after 4 am in the <a onclick=\"toggleAllSettings()\" href=\"#TIMEZONE\"><code>TIMEZONE</code> you set above</a>. Will be run NEXT time the time passes."
|
||||
},
|
||||
{
|
||||
"language_code":"es_es",
|
||||
"string" : "Solo está habilitado si selecciona <code>schedule</code> en la configuración <a href=\"#MQTT_RUN\"><code>MQTT_RUN</code></a>. Asegúrese de ingresar la programación en el formato similar a cron correcto (por ejemplo, valide en <a href=\"https://crontab.guru/\" target=\"_blank\">crontab.guru</a>). Por ejemplo, ingresar <code>0 4 * * *</code> ejecutará el escaneo después de las 4 a.m. en el <a onclick=\"toggleAllSettings()\" href=\"#TIMEZONE\"><code>TIMEZONE</ código> que configuró arriba</a>. Se ejecutará la PRÓXIMA vez que pase el tiempo."
|
||||
},
|
||||
{
|
||||
"language_code":"de_de",
|
||||
"string" : "Nur aktiviert, wenn Sie <code>schedule</code> in der <a href=\"#MQTT_RUN\"><code>MQTT_RUN</code>-Einstellung</a> auswählen. Stellen Sie sicher, dass Sie den Zeitplan im richtigen Cron-ähnlichen Format eingeben (z. B. validieren unter <a href=\"https://crontab.guru/\" target=\"_blank\">crontab.guru</a>). Wenn Sie beispielsweise <code>0 4 * * *</code> eingeben, wird der Scan nach 4 Uhr morgens in der <a onclick=\"toggleAllSettings()\" href=\"#TIMEZONE\"><code>TIMEZONE</ ausgeführt. Code> den Sie oben festgelegt haben</a>. Wird das NÄCHSTE Mal ausgeführt, wenn die Zeit vergeht."
|
||||
}]
|
||||
},
|
||||
{
|
||||
"function": "RUN_TIMEOUT",
|
||||
"type": "integer",
|
||||
@@ -462,6 +493,40 @@
|
||||
"language_code": "es_es",
|
||||
"string" : "Un pequeño truco: retrase la adición a la cola en caso de que el proceso se reinicie y los procesos de publicación anteriores se anulen (se necesitan ~<code>2</code>s para actualizar la configuración de un sensor en el intermediario). Probado con <code>2</code>-<code>3</code> segundos de retraso. Este retraso solo se aplica cuando se crean dispositivos (durante el primer bucle de notificación). No afecta los escaneos o notificaciones posteriores."
|
||||
}]
|
||||
},
|
||||
{
|
||||
"function": "SEND_STATS",
|
||||
"type": "boolean",
|
||||
"default_value":true,
|
||||
"options": [],
|
||||
"localized": ["name", "description"],
|
||||
"name" : [{
|
||||
"language_code":"en_us",
|
||||
"string" : "Send stats"
|
||||
}
|
||||
],
|
||||
"description": [{
|
||||
"language_code":"en_us",
|
||||
"string" : "Check to send overal device stats, such as number of Online and Offline devices."
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"function": "SEND_DEVICES",
|
||||
"type": "boolean",
|
||||
"default_value":true,
|
||||
"options": [],
|
||||
"localized": ["name", "description"],
|
||||
"name" : [{
|
||||
"language_code":"en_us",
|
||||
"string" : "Send devices"
|
||||
}
|
||||
],
|
||||
"description": [{
|
||||
"language_code":"en_us",
|
||||
"string" : "Check to send individual devices to the broker with details, such as <code>is_new</code>, <code>is_present</code>, or <code>mac_address</code> of the devices."
|
||||
}
|
||||
]
|
||||
}
|
||||
]
|
||||
}
|
||||
|
||||
@@ -132,6 +132,12 @@ class sensor_config:
|
||||
|
||||
def publish_mqtt(client, topic, message):
|
||||
status = 1
|
||||
|
||||
|
||||
|
||||
mylog('verbose', [f"[{pluginName}] Sending MQTT topic: {topic}"])
|
||||
mylog('verbose', [f"[{pluginName}] Sending MQTT message: {message}"])
|
||||
|
||||
while status != 0:
|
||||
result = client.publish(
|
||||
topic=topic,
|
||||
@@ -183,7 +189,7 @@ def publish_sensor(client, sensorConfig):
|
||||
global mqtt_sensors
|
||||
|
||||
message = '{ \
|
||||
"name":"'+ sensorConfig.deviceName +' '+sensorConfig.sensorName+'", \
|
||||
"name":"'+sensorConfig.sensorName+'", \
|
||||
"state_topic":"system-sensors/'+sensorConfig.sensorType+'/'+sensorConfig.deviceId+'/state", \
|
||||
"value_template":"{{value_json.'+sensorConfig.sensorName+'}}", \
|
||||
"unique_id":"'+sensorConfig.deviceId+'_sensor_'+sensorConfig.sensorName+'", \
|
||||
@@ -251,76 +257,80 @@ def mqtt_start(db):
|
||||
# General stats
|
||||
|
||||
# Create a generic device for overal stats
|
||||
create_generic_device(client)
|
||||
if get_setting_value('MQTT_SEND_STATS') == True:
|
||||
# Create a new device representing overall PiAlert stats
|
||||
create_generic_device(client)
|
||||
|
||||
# Get the data
|
||||
row = get_device_stats(db)
|
||||
# Get the data
|
||||
row = get_device_stats(db)
|
||||
|
||||
columns = ["online","down","all","archived","new","unknown"]
|
||||
columns = ["online","down","all","archived","new","unknown"]
|
||||
|
||||
payload = ""
|
||||
payload = ""
|
||||
|
||||
# Update the values
|
||||
for column in columns:
|
||||
payload += '"'+column+'": ' + str(row[column]) +','
|
||||
# Update the values
|
||||
for column in columns:
|
||||
payload += '"'+column+'": ' + str(row[column]) +','
|
||||
|
||||
# Publish (warap into {} and remove last ',' from above)
|
||||
publish_mqtt(client, "system-sensors/sensor/pialert/state",
|
||||
'{ \
|
||||
'+ payload[:-1] +'\
|
||||
}'
|
||||
)
|
||||
# Publish (wrap into {} and remove last ',' from above)
|
||||
publish_mqtt(client, "system-sensors/sensor/pialert/state",
|
||||
'{ \
|
||||
'+ payload[:-1] +'\
|
||||
}'
|
||||
)
|
||||
|
||||
# Generate device-specific MQTT messages if enabled
|
||||
if get_setting_value('MQTT_SEND_DEVICES') == True:
|
||||
|
||||
# Specific devices
|
||||
# Specific devices
|
||||
|
||||
# Get all devices
|
||||
devices = get_all_devices(db)
|
||||
# Get all devices
|
||||
devices = get_all_devices(db)
|
||||
|
||||
sec_delay = len(devices) * int(get_setting_value('MQTT_DELAY_SEC'))*5
|
||||
sec_delay = len(devices) * int(get_setting_value('MQTT_DELAY_SEC'))*5
|
||||
|
||||
mylog('minimal', [f"[{pluginName}] Estimated delay: ", (sec_delay), 's ', '(', round(sec_delay/60,1) , 'min)' ])
|
||||
mylog('minimal', [f"[{pluginName}] Estimated delay: ", (sec_delay), 's ', '(', round(sec_delay/60,1) , 'min)' ])
|
||||
|
||||
|
||||
for device in devices:
|
||||
|
||||
for device in devices:
|
||||
|
||||
|
||||
# Create devices in Home Assistant - send config messages
|
||||
deviceId = 'mac_' + device["dev_MAC"].replace(" ", "").replace(":", "_").lower()
|
||||
deviceNameDisplay = re.sub('[^a-zA-Z0-9-_\s]', '', device["dev_Name"])
|
||||
|
||||
# Create devices in Home Assistant - send config messages
|
||||
deviceId = 'mac_' + device["dev_MAC"].replace(" ", "").replace(":", "_").lower()
|
||||
deviceNameDisplay = re.sub('[^a-zA-Z0-9-_\s]', '', device["dev_Name"])
|
||||
|
||||
create_sensor(client, deviceId, deviceNameDisplay, 'sensor', 'last_ip', 'ip-network', device["dev_MAC"])
|
||||
create_sensor(client, deviceId, deviceNameDisplay, 'binary_sensor', 'is_present', 'wifi', device["dev_MAC"])
|
||||
create_sensor(client, deviceId, deviceNameDisplay, 'sensor', 'mac_address', 'folder-key-network', device["dev_MAC"])
|
||||
create_sensor(client, deviceId, deviceNameDisplay, 'sensor', 'is_new', 'bell-alert-outline', device["dev_MAC"])
|
||||
create_sensor(client, deviceId, deviceNameDisplay, 'sensor', 'vendor', 'cog', device["dev_MAC"])
|
||||
|
||||
# update device sensors in home assistant
|
||||
create_sensor(client, deviceId, deviceNameDisplay, 'sensor', 'last_ip', 'ip-network', device["dev_MAC"])
|
||||
create_sensor(client, deviceId, deviceNameDisplay, 'binary_sensor', 'is_present', 'wifi', device["dev_MAC"])
|
||||
create_sensor(client, deviceId, deviceNameDisplay, 'sensor', 'mac_address', 'folder-key-network', device["dev_MAC"])
|
||||
create_sensor(client, deviceId, deviceNameDisplay, 'sensor', 'is_new', 'bell-alert-outline', device["dev_MAC"])
|
||||
create_sensor(client, deviceId, deviceNameDisplay, 'sensor', 'vendor', 'cog', device["dev_MAC"])
|
||||
|
||||
# update device sensors in home assistant
|
||||
|
||||
publish_mqtt(client, 'system-sensors/sensor/'+deviceId+'/state',
|
||||
'{ \
|
||||
"last_ip": "' + device["dev_LastIP"] +'", \
|
||||
"is_new": "' + str(device["dev_NewDevice"]) +'", \
|
||||
"vendor": "' + sanitize_string(device["dev_Vendor"]) +'", \
|
||||
"mac_address": "' + str(device["dev_MAC"]) +'" \
|
||||
}'
|
||||
)
|
||||
publish_mqtt(client, 'system-sensors/sensor/'+deviceId+'/state',
|
||||
'{ \
|
||||
"last_ip": "' + device["dev_LastIP"] +'", \
|
||||
"is_new": "' + str(device["dev_NewDevice"]) +'", \
|
||||
"vendor": "' + sanitize_string(device["dev_Vendor"]) +'", \
|
||||
"mac_address": "' + str(device["dev_MAC"]) +'" \
|
||||
}'
|
||||
)
|
||||
|
||||
publish_mqtt(client, 'system-sensors/binary_sensor/'+deviceId+'/state',
|
||||
'{ \
|
||||
"is_present": "' + to_binary_sensor(str(device["dev_PresentLastScan"])) +'"\
|
||||
}'
|
||||
)
|
||||
publish_mqtt(client, 'system-sensors/binary_sensor/'+deviceId+'/state',
|
||||
'{ \
|
||||
"is_present": "' + to_binary_sensor(str(device["dev_PresentLastScan"])) +'"\
|
||||
}'
|
||||
)
|
||||
|
||||
# delete device / topic
|
||||
# homeassistant/sensor/mac_44_ef_bf_c4_b1_af/is_present/config
|
||||
# client.publish(
|
||||
# topic="homeassistant/sensor/"+deviceId+"/is_present/config",
|
||||
# payload="",
|
||||
# qos=1,
|
||||
# retain=True,
|
||||
# )
|
||||
# time.sleep(10)
|
||||
# delete device / topic
|
||||
# homeassistant/sensor/mac_44_ef_bf_c4_b1_af/is_present/config
|
||||
# client.publish(
|
||||
# topic="homeassistant/sensor/"+deviceId+"/is_present/config",
|
||||
# payload="",
|
||||
# qos=1,
|
||||
# retain=True,
|
||||
# )
|
||||
# time.sleep(10)
|
||||
|
||||
|
||||
#===============================================================================
|
||||
|
||||
@@ -12,3 +12,9 @@ Arp-scan is a command-line tool that uses the ARP protocol to discover and finge
|
||||
- SAVE
|
||||
- Wait for the next scan to finish
|
||||
|
||||
#### Examples
|
||||
|
||||
Settings:
|
||||
|
||||

|
||||
|
||||
|
||||
BIN front/plugins/arp_scan/arp-scan-settings.png (Executable file, binary file not shown; after: 178 KiB)
@@ -106,7 +106,7 @@
|
||||
"description": [
|
||||
{
|
||||
"language_code": "en_us",
|
||||
"string": "Specify when your Network-discovery scan will run. Typical setting would be <code>schedule</code> and then you specify a cron-like schedule in the <a href=\"#ARPSCAN_RUN_SCHD\"><code>ARPSCAN_RUN_SCHD</code>setting</a> "
|
||||
"string": "Specify when your Network-discovery scan will run. Typical setting would be <code>schedule</code> and then you specify a cron-like schedule in the <a href=\"#ARPSCAN_RUN_SCHD\"><code>ARPSCAN_RUN_SCHD</code>setting</a>. ⚠ Use the same schedule if you have multiple <i class=\"fa-solid fa-magnifying-glass-plus\"></i> Device scanners enabled."
|
||||
},
|
||||
{
|
||||
"language_code": "es_es",
|
||||
|
||||
@@ -16,4 +16,4 @@ Plugin generating CSV backups of your Devices database table, including the netw
|
||||
|
||||
### Usage
|
||||
|
||||
- If the devices.csv file can be overwritten or the date and time timestamp added to the name. This is toggled with the `CSVBCKP_overwrite` setting.
|
||||
- The `devices.csv` file can either be overwritten on each backup or have a date-and-time timestamp added to its name. This is toggled with the `CSVBCKP_overwrite` setting.
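As a rough illustration of the two modes (a sketch only, not the plugin's actual code; the exact timestamp format used by the plugin is an assumption):

```python
# Sketch: choosing between overwriting devices.csv and writing a timestamped copy.
# The real CSVBCKP plugin may use a different filename pattern.
from datetime import datetime

def backup_filename(overwrite: bool) -> str:
    if overwrite:
        return "devices.csv"  # the same file is replaced on every run
    stamp = datetime.now().strftime("%Y%m%d_%H%M%S")
    return f"devices_{stamp}.csv"  # a new file is kept for every run
```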
|
||||
|
||||
@@ -18,7 +18,7 @@ sys.path.append('/home/pi/pialert/pialert')
|
||||
from plugin_helper import Plugin_Object, Plugin_Objects, decodeBase64
|
||||
from logger import mylog, append_line_to_file
|
||||
from helper import timeNowTZ
|
||||
from const import logPath, pialertPath
|
||||
from const import logPath, pialertPath, fullDbPath
|
||||
|
||||
|
||||
CUR_PATH = str(pathlib.Path(__file__).parent.resolve())
|
||||
@@ -43,7 +43,7 @@ def main():
|
||||
mylog('verbose', ['[CSVBCKP] In script'])
|
||||
|
||||
# Connect to the PiAlert SQLite database
|
||||
conn = sqlite3.connect('/home/pi/pialert/db/pialert.db')
|
||||
conn = sqlite3.connect(fullDbPath)
|
||||
cursor = conn.cursor()
|
||||
|
||||
# Execute your SQL query
|
||||
|
||||
@@ -172,6 +172,25 @@
|
||||
"string": "Maximale Zeit in Sekunden, die auf den Abschluss des Skripts gewartet werden soll. Bei Überschreitung dieser Zeit wird das Skript abgebrochen."
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"function": "NOTIFI_HIST",
|
||||
"type": "integer",
|
||||
"default_value": 100,
|
||||
"options": [],
|
||||
"localized": ["name", "description"],
|
||||
"name": [
|
||||
{
|
||||
"language_code": "en_us",
|
||||
"string": "Notifications History"
|
||||
}
|
||||
],
|
||||
"description": [
|
||||
{
|
||||
"language_code": "en_us",
|
||||
"string": "How many historical entries of Notifications should be kept. This influences how mane entries are also available in the Report section in the UI"
|
||||
}
|
||||
]
|
||||
}
|
||||
],
|
||||
|
||||
|
||||
@@ -18,7 +18,7 @@ sys.path.append('/home/pi/pialert/pialert')
|
||||
from plugin_helper import Plugin_Object, Plugin_Objects, decodeBase64
|
||||
from logger import mylog, append_line_to_file
|
||||
from helper import timeNowTZ, get_setting_value
|
||||
from const import logPath, pialertPath
|
||||
from const import logPath, pialertPath, fullDbPath
|
||||
|
||||
|
||||
CUR_PATH = str(pathlib.Path(__file__).parent.resolve())
|
||||
@@ -44,7 +44,7 @@ def main():
|
||||
|
||||
|
||||
# Execute cleanup/upkeep
|
||||
cleanup_database('/home/pi/pialert/db/pialert.db', DAYS_TO_KEEP_EVENTS, PHOLUS_DAYS_DATA, HRS_TO_KEEP_NEWDEV, PLUGINS_KEEP_HIST)
|
||||
cleanup_database(fullDbPath, DAYS_TO_KEEP_EVENTS, PHOLUS_DAYS_DATA, HRS_TO_KEEP_NEWDEV, PLUGINS_KEEP_HIST)
|
||||
|
||||
mylog('verbose', ['[DBCLNP] Cleanup complete file '])
|
||||
|
||||
|
||||
@@ -478,7 +478,7 @@
|
||||
"description": [
|
||||
{
|
||||
"language_code": "en_us",
|
||||
"string": "Enable import of devices from <code>dhcp.leases</code> files. If you select <code>schedule</code> the scheduling settings from below are applied. If you select <code>once</code> the scan is run only once on start of the application (container) or after you update your settings."
|
||||
"string": "Enable import of devices from <code>dhcp.leases</code> files. If you select <code>schedule</code> the scheduling settings from below are applied. If you select <code>once</code> the scan is run only once on start of the application (container) or after you update your settings. ⚠ Use the same schedule if you have multiple <i class=\"fa-solid fa-magnifying-glass-plus\"></i> Device scanners enabled."
|
||||
},
|
||||
{
|
||||
"language_code": "es_es",
|
||||
|
||||
@@ -11,7 +11,7 @@ import chardet
|
||||
sys.path.append("/home/pi/pialert/front/plugins")
|
||||
sys.path.append('/home/pi/pialert/pialert')
|
||||
|
||||
from plugin_helper import Plugin_Object, Plugin_Objects, handleEmpty
|
||||
from plugin_helper import Plugin_Object, Plugin_Objects, handleEmpty, is_mac
|
||||
from logger import mylog
|
||||
from dhcp_leases import DhcpLeases
|
||||
|
||||
@@ -76,16 +76,20 @@ def get_entries(path, plugin_objects):
|
||||
leases = DhcpLeases(path)
|
||||
leasesList = leases.get()
|
||||
for lease in leasesList:
|
||||
plugin_objects.add_object(
|
||||
primaryId = handleEmpty(lease.ethernet),
|
||||
secondaryId = handleEmpty(lease.ip),
|
||||
watched1 = handleEmpty(lease.active),
|
||||
watched2 = handleEmpty(lease.hostname),
|
||||
watched3 = handleEmpty(lease.hardware),
|
||||
watched4 = handleEmpty(lease.binding_state),
|
||||
extra = handleEmpty(path),
|
||||
foreignKey = handleEmpty(lease.ethernet)
|
||||
)
|
||||
|
||||
# filter out irrelevant entries (e.g. from OPNsense dhcp.leases files)
|
||||
if is_mac(lease.ethernet):
|
||||
|
||||
plugin_objects.add_object(
|
||||
primaryId = handleEmpty(lease.ethernet),
|
||||
secondaryId = handleEmpty(lease.ip),
|
||||
watched1 = handleEmpty(lease.active),
|
||||
watched2 = handleEmpty(lease.hostname),
|
||||
watched3 = handleEmpty(lease.hardware),
|
||||
watched4 = handleEmpty(lease.binding_state),
|
||||
extra = handleEmpty(path),
|
||||
foreignKey = handleEmpty(lease.ethernet)
|
||||
)
|
||||
return plugin_objects
|
||||
|
||||
if __name__ == '__main__':
|
||||
|
||||
@@ -290,7 +290,7 @@
|
||||
}],
|
||||
"description": [{
|
||||
"language_code":"en_us",
|
||||
"string" : "Enable a regular scan of rogue DHCP servers. If you select <code>schedule</code> the scheduling settings from below are applied. If you select <code>once</code> the scan is run only once on start of the application (container) or after you update your settings."
|
||||
"string" : "Enable a regular scan of rogue DHCP servers. If you select <code>schedule</code> the scheduling settings from below are applied. If you select <code>once</code> the scan is run only once on start of the application (container) or after you update your settings. ⚠ Use the same schedule if you have multiple <i class=\"fa-solid fa-magnifying-glass-plus\"></i> Device scanners enabled."
|
||||
},
|
||||
{
|
||||
"language_code":"es_es",
|
||||
|
||||
@@ -57,9 +57,9 @@
|
||||
"value": "SELECT dev_LastIP FROM Devices WHERE dev_MAC = 'Internet' "
|
||||
},
|
||||
{
|
||||
"name": "DIG_GET_IP_ARG",
|
||||
"name": "INTRNT_DIG_GET_IP_ARG",
|
||||
"type": "setting",
|
||||
"value": "DIG_GET_IP_ARG",
|
||||
"value": "INTRNT_DIG_GET_IP_ARG",
|
||||
"base64": true
|
||||
}
|
||||
],
|
||||
@@ -109,7 +109,7 @@
|
||||
{
|
||||
"function": "CMD",
|
||||
"type": "readonly",
|
||||
"default_value": "python3 /home/pi/pialert/front/plugins/internet_ip/script.py prev_ip={prev_ip} DIG_GET_IP_ARG={DIG_GET_IP_ARG}",
|
||||
"default_value": "python3 /home/pi/pialert/front/plugins/internet_ip/script.py prev_ip={prev_ip} INTRNT_DIG_GET_IP_ARG={INTRNT_DIG_GET_IP_ARG}",
|
||||
"options": [],
|
||||
"localized": [
|
||||
"name",
|
||||
@@ -144,6 +144,44 @@
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"function": "DIG_GET_IP_ARG",
|
||||
"type": "text",
|
||||
"default_value": "-4 myip.opendns.com @resolver1.opendns.com",
|
||||
"options": [],
|
||||
"localized": [
|
||||
"name",
|
||||
"description"
|
||||
],
|
||||
"name": [
|
||||
{
|
||||
"language_code": "en_us",
|
||||
"string": "Internet IP discovery"
|
||||
},
|
||||
{
|
||||
"language_code": "es_es",
|
||||
"string": "Descubrir de IP de Internet"
|
||||
},
|
||||
{
|
||||
"language_code": "de_de",
|
||||
"string": "Erkennung externer IP (\"Internet IP\")"
|
||||
}
|
||||
],
|
||||
"description": [
|
||||
{
|
||||
"language_code": "en_us",
|
||||
"string": "Change the <a href=\"https://linux.die.net/man/1/dig\" target=\"_blank\">dig utility</a> arguments if you have issues resolving your Internet IP. Arguments are added at the end of the following command: <code>dig +short </code>."
|
||||
},
|
||||
{
|
||||
"language_code": "es_es",
|
||||
"string": "Cambie los argumentos de la <a href=\"https://linux.die.net/man/1/dig\" target=\"_blank\">utilidad de dig</a> si tiene problemas para resolver su IP de Internet. Los argumentos se agregan al final del siguiente comando: <code>dig +short </code>."
|
||||
},
|
||||
{
|
||||
"language_code": "de_de",
|
||||
"string": "Ändere die Argumente des <a href=\"https://linux.die.net/man/1/dig\" target=\"_blank\">dig Dienstprogramms</a>, wenn Probleme beim Auflösen der externen IP auftreten. Argumente werden an das Ende des folgenden Befehls angehängt: <code>dig +short </code>."
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"function": "RUN_SCHD",
|
||||
"type": "text",
|
||||
|
||||
@@ -20,7 +20,7 @@ sys.path.append('/home/pi/pialert/pialert')
|
||||
|
||||
from plugin_helper import Plugin_Object, Plugin_Objects, decodeBase64
|
||||
from logger import mylog, append_line_to_file
|
||||
from helper import timeNowTZ, check_IP_format
|
||||
from helper import timeNowTZ, check_IP_format, get_setting_value
|
||||
from const import logPath, pialertPath, fullDbPath
|
||||
|
||||
|
||||
@@ -37,19 +37,19 @@ def main():
|
||||
parser = argparse.ArgumentParser(description='Check internet connectivity and IP')
|
||||
|
||||
parser.add_argument('prev_ip', action="store", help="Previous IP address to compare against the current IP")
|
||||
parser.add_argument('DIG_GET_IP_ARG', action="store", help="Arguments for the 'dig' command to retrieve the IP address")
|
||||
parser.add_argument('DIG_GET_IP_ARG', action="store", help="Arguments for the 'dig' command to retrieve the IP address") # unused
|
||||
|
||||
values = parser.parse_args()
|
||||
|
||||
PREV_IP = values.prev_ip.split('=')[1]
|
||||
DIG_GET_IP_ARG = values.DIG_GET_IP_ARG.split('=b')[1] # byte64 encoded
|
||||
DIG_GET_IP_ARG = get_setting_value("INTRNT_DIG_GET_IP_ARG")
|
||||
|
||||
mylog('verbose', [f'[{pluginName}] DIG_GET_IP_ARG: ', DIG_GET_IP_ARG])
|
||||
mylog('verbose', [f'[{pluginName}] INTRNT_DIG_GET_IP_ARG: ', DIG_GET_IP_ARG])
|
||||
|
||||
# Decode the base64-encoded value to get the actual value in ASCII format.
|
||||
DIG_GET_IP_ARG = base64.b64decode(DIG_GET_IP_ARG).decode('ascii')
|
||||
# DIG_GET_IP_ARG = base64.b64decode(DIG_GET_IP_ARG).decode('ascii')
|
||||
|
||||
mylog('verbose', [f'[{pluginName}] DIG_GET_IP_ARG resolved: {DIG_GET_IP_ARG} '])
|
||||
# mylog('verbose', [f'[{pluginName}] DIG_GET_IP_ARG resolved: {DIG_GET_IP_ARG} '])
|
||||
|
||||
# perform the new IP lookup
|
||||
new_internet_IP, cmd_output = check_internet_IP( PREV_IP, DIG_GET_IP_ARG)
|
||||
|
||||
@@ -395,7 +395,7 @@
|
||||
{
|
||||
"function": "dev_NewDevice",
|
||||
"type": "integer.checkbox",
|
||||
"default_value": true,
|
||||
"default_value": 1,
|
||||
"options": [],
|
||||
"localized": ["name", "description"],
|
||||
"name": [
|
||||
|
||||
7 front/plugins/notification_processing/README.md (Executable file)
@@ -0,0 +1,7 @@
|
||||
## Overview
|
||||
|
||||
Plugin supplying settings for Notification Processing.
|
||||
|
||||
### Usage
|
||||
|
||||
- Check the Settings page for details.
|
||||
129 front/plugins/notification_processing/config.json (Executable file)
@@ -0,0 +1,129 @@
|
||||
{
|
||||
"code_name": "notification_processing",
|
||||
"unique_prefix": "NTFPRCS",
|
||||
"plugin_type": "system",
|
||||
"enabled": true,
|
||||
"data_source": "script",
|
||||
"show_ui": false,
|
||||
"localized": ["display_name", "description", "icon"],
|
||||
"display_name": [
|
||||
{
|
||||
"language_code": "en_us",
|
||||
"string": "Notification Processing"
|
||||
}
|
||||
],
|
||||
"icon": [
|
||||
{
|
||||
"language_code": "en_us",
|
||||
"string": "<i class=\"fa-solid fa-envelopes-bulk\"></i>"
|
||||
}
|
||||
],
|
||||
"description": [
|
||||
{
|
||||
"language_code": "en_us",
|
||||
"string": "A plugin to for advanced notification processing."
|
||||
}
|
||||
],
|
||||
"params" : [
|
||||
],
|
||||
|
||||
"settings": [
|
||||
{
|
||||
"function": "INCLUDED_SECTIONS",
|
||||
"type": "text.multiselect",
|
||||
"default_value": ["new_devices", "down_devices", "events"],
|
||||
"options": ["new_devices", "down_devices", "events", "plugins"],
|
||||
"localized": ["name", "description"],
|
||||
"name": [
|
||||
{
|
||||
"language_code": "en_us",
|
||||
"string": "Notify on"
|
||||
},
|
||||
{
|
||||
"language_code": "de_de",
|
||||
"string": "Benachrichtigungen"
|
||||
},
|
||||
{
|
||||
"language_code": "es_es",
|
||||
"string": "Notificar en"
|
||||
}
|
||||
],
|
||||
"description": [
|
||||
{
|
||||
"language_code": "en_us",
|
||||
"string": "Specifies which events trigger notifications. Remove the event type(s) you do not want to get notified on. This setting overrides device-specific settings in the UI. (<code>CTRL + Click</code> to select/deselect)."
|
||||
},
|
||||
{
|
||||
"language_code": "de_de",
|
||||
"string": "Spezifiziert, bei welchen Events Benachrichtigungen versendet werden. Entfernen Sie die Eventtypen, bei welchen Sie nicht benachrichtigt werden wollen. Diese Einstellung überschreibt gerätespezifische Einstellungen im UI. (<code>STRG + klicken</code> zum aus-/abwählen)."
|
||||
},
|
||||
{
|
||||
"language_code": "es_es",
|
||||
"string": "Especifica que eventos envían notificaciones. Elimina los tipos de eventos de los que no quieras recibir notificaciones. Este ajuste sobreescribe los ajustes específicos de los dispositivos en la interfaz. (<code>CTRL + Clic</code> para seleccionar / deseleccionar)."
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"function": "alert_down_time",
|
||||
"type": "integer",
|
||||
"default_value": 5,
|
||||
"options": [],
|
||||
"localized": ["name", "description"],
|
||||
"name": [
|
||||
{
|
||||
"language_code": "en_us",
|
||||
"string": "Alert Down After"
|
||||
}
|
||||
],
|
||||
"description": [
|
||||
{
|
||||
"language_code": "en_us",
|
||||
"string": "After how many minutes a device is reported as down and a notification is sent."
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"function": "new_dev_condition",
|
||||
"type": "text",
|
||||
"default_value": "",
|
||||
"options": [],
|
||||
"localized": ["name", "description"],
|
||||
"name": [
|
||||
{
|
||||
"language_code": "en_us",
|
||||
"string": "New Devices Filter"
|
||||
}
|
||||
],
|
||||
"description": [
|
||||
{
|
||||
"language_code": "en_us",
|
||||
"string": "You can specify a SQL where condition to filter out New Devices from notifications. For example <code>AND dev_LastIP NOT LIKE '192.168.3.%'</code> will always exlude New Device notifications for all devices with the IP starting with <code>192.168.3.%</code>."
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"function": "event_condition",
|
||||
"type": "text",
|
||||
"default_value": "",
|
||||
"options": [],
|
||||
"localized": ["name", "description"],
|
||||
"name": [
|
||||
{
|
||||
"language_code": "en_us",
|
||||
"string": "Events Filter"
|
||||
}
|
||||
],
|
||||
"description": [
|
||||
{
|
||||
"language_code": "en_us",
|
||||
"string": "You can specify a SQL where condition to filter out Events from notifications. For example <code>AND dev_LastIP NOT LIKE '192.168.3.%'</code> will always exlude New Device notifications for all devices with the IP starting with <code>192.168.3.%</code>."
|
||||
}
|
||||
]
|
||||
}
|
||||
],
|
||||
|
||||
"database_column_definitions":
|
||||
[
|
||||
|
||||
]
|
||||
}
|
||||
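The four settings above surface in pialert.conf under the plugin's NTFPRCS_ prefix (the notification-processing code further down reads them via get_setting_value('NTFPRCS_...')). A minimal sketch of the resulting config entries with illustrative values; the {s-quote} placeholder is swapped for a single quote when the condition is injected into the SQL query:

NTFPRCS_INCLUDED_SECTIONS=['new_devices','down_devices','events']
NTFPRCS_alert_down_time=5
NTFPRCS_new_dev_condition='AND dev_LastIP NOT LIKE {s-quote}192.168.3.%{s-quote}'
NTFPRCS_event_condition=''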
@@ -73,7 +73,7 @@
|
||||
}],
|
||||
"description": [{
|
||||
"language_code":"en_us",
|
||||
"string" : "Specify when your PiHole device import from the PiHole databse will run. The typical setting would be <code>schedule</code> and then you specify a cron-like schedule in the <a href=\"#PIHOLE_RUN_SCHD\"><code>PIHOLE_RUN_SCHD</code>setting</a>. If enabled, you must map the pihole db into your container to the <code>:/etc/pihole/pihole-FTL.db</code> mount path as specified in the <code>DB_PATH</code> setting."
|
||||
"string" : "Specify when your PiHole device import from the PiHole database will run. The typical setting would be <code>schedule</code> and then you specify a cron-like schedule in the <a href=\"#PIHOLE_RUN_SCHD\"><code>PIHOLE_RUN_SCHD</code>setting</a>. If enabled, you must map the pihole db into your container to the <code>:/etc/pihole/pihole-FTL.db</code> mount path as specified in the <code>DB_PATH</code> setting. ⚠ Use the same schedule if you have multiple <i class=\"fa-solid fa-magnifying-glass-plus\"></i> Device scanners enabled."
|
||||
},
|
||||
{
|
||||
"language_code":"es_es",
|
||||
@@ -83,7 +83,7 @@
|
||||
{
|
||||
"function": "CMD",
|
||||
"type": "text",
|
||||
"default_value":"SELECT n.hwaddr AS Object_PrimaryID, {s-quote}null{s-quote} AS Object_SecondaryID, datetime() AS DateTime, na.ip AS Watched_Value1, n.lastQuery AS Watched_Value2, na.name AS Watched_Value3, n.macVendor AS Watched_Value4, {s-quote}null{s-quote} AS Extra, n.hwaddr AS ForeignKey FROM EXTERNAL_PIHOLE.Network AS n LEFT JOIN EXTERNAL_PIHOLE.Network_Addresses AS na ON na.network_id = n.id WHERE n.hwaddr NOT LIKE {s-quote}ip-%{s-quote} AND n.hwaddr <> {s-quote}00:00:00:00:00:00{s-quote} AND na.ip <> null;",
|
||||
"default_value":"SELECT n.hwaddr AS Object_PrimaryID, {s-quote}null{s-quote} AS Object_SecondaryID, datetime() AS DateTime, na.ip AS Watched_Value1, n.lastQuery AS Watched_Value2, na.name AS Watched_Value3, n.macVendor AS Watched_Value4, {s-quote}null{s-quote} AS Extra, n.hwaddr AS ForeignKey FROM EXTERNAL_PIHOLE.Network AS n LEFT JOIN EXTERNAL_PIHOLE.Network_Addresses AS na ON na.network_id = n.id WHERE n.hwaddr NOT LIKE {s-quote}ip-%{s-quote} AND n.hwaddr is not {s-quote}00:00:00:00:00:00{s-quote} AND na.ip is not null",
|
||||
"options": [],
|
||||
"localized": ["name", "description"],
|
||||
"name" : [{
|
||||
|
||||
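The change from <> to IS NOT in the default query above matters because of SQL three-valued logic: in SQLite, any comparison with NULL evaluates to NULL (never true), so the old "na.ip <> null" filter matched no rows at all. A small standalone sketch (not part of the plugin) demonstrating the difference:

import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE t (ip TEXT)")
con.executemany("INSERT INTO t VALUES (?)", [("192.168.1.5",), (None,)])

# "<> null" never matches - the comparison yields NULL, which is not true
print(con.execute("SELECT COUNT(*) FROM t WHERE ip <> null").fetchone())      # (0,)
# "IS NOT NULL" is the correct test for missing values
print(con.execute("SELECT COUNT(*) FROM t WHERE ip IS NOT NULL").fetchone())  # (1,)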
@@ -45,6 +45,11 @@ def handleEmpty(input):
|
||||
input = re.sub(r'[^\x00-\x7F]+', ' ', input)
|
||||
return input
|
||||
|
||||
# -------------------------------------------------------------------
|
||||
# Check if a valid MAC address
|
||||
def is_mac(input):
|
||||
return re.match("[0-9a-f]{2}([-:]?)[0-9a-f]{2}(\\1[0-9a-f]{2}){4}$", input.lower())
|
||||
|
||||
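# Hedged usage sketch for the is_mac() helper added above: re.match() returns a
# match object for well-formed MAC addresses (":"-, "-"- or un-separated) and
# None otherwise, so the result can be used directly in a truth test.
print(bool(is_mac("AA:BB:CC:DD:EE:FF")))   # True  (case-insensitive)
print(bool(is_mac("aa-bb-cc-dd-ee-ff")))   # True
print(bool(is_mac("aabbccddeeff")))        # True
print(bool(is_mac("192.168.1.10")))        # False - not a MAC address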
# -------------------------------------------------------------------
|
||||
def decodeBase64(inputParamBase64):
|
||||
|
||||
|
||||
@@ -67,7 +67,7 @@
|
||||
},
|
||||
{
|
||||
"function": "CMD",
|
||||
"type": "text",
|
||||
"type": "readonly",
|
||||
"default_value":"/home/pi/pialert/back/pialert-cli set_password {password}",
|
||||
"options": [],
|
||||
"localized": ["name", "description"],
|
||||
@@ -108,7 +108,7 @@
|
||||
"description": [
|
||||
{
|
||||
"language_code": "en_us",
|
||||
"string": "The default password is <code>123456</code>. To change the password run <code>/home/pi/pialert/back/pialert-cli set_password {password}</code> in the container"
|
||||
"string": "The default password is <code>123456</code>. To change it, you can either use this plugin (follow the instructions in the <code>SETPWD_RUN</code> setting) or run <code>/home/pi/pialert/back/pialert-cli set_password {password}</code> in the container."
|
||||
},
|
||||
{
|
||||
"language_code": "es_es",
|
||||
|
||||
@@ -322,7 +322,7 @@
|
||||
}],
|
||||
"description": [{
|
||||
"language_code":"en_us",
|
||||
"string" : "Enable import of devices from a SNMP enabled device. If you select <code>schedule</code> the scheduling settings from below are applied. If you select <code>once</code> the scan is run only once on start of the application (container) or after you update your settings."
|
||||
"string" : "Enable import of devices from a SNMP enabled device. If you select <code>schedule</code> the scheduling settings from below are applied. If you select <code>once</code> the scan is run only once on start of the application (container) or after you update your settings. ⚠ Use the same schedule if you have multiple <i class=\"fa-solid fa-magnifying-glass-plus\"></i> Device scanners enabled."
|
||||
},
|
||||
{
|
||||
"language_code":"es_es",
|
||||
|
||||
@@ -13,7 +13,7 @@ import sys
|
||||
sys.path.append("/home/pi/pialert/front/plugins")
|
||||
sys.path.append('/home/pi/pialert/pialert')
|
||||
|
||||
from plugin_helper import Plugin_Object, Plugin_Objects, decodeBase64
|
||||
from plugin_helper import Plugin_Object, Plugin_Objects, decodeBase64, handleEmpty
|
||||
from logger import mylog
|
||||
from helper import timeNowTZ
|
||||
from const import logPath, pialertPath
|
||||
|
||||
@@ -479,7 +479,7 @@
|
||||
"description": [
|
||||
{
|
||||
"language_code": "en_us",
|
||||
"string": "Enable import of devices from a UNIFI controller. If you select <code>schedule</code> the scheduling settings from below are applied. If you select <code>once</code> the scan is run only once on start of the application (container) or after you update your settings."
|
||||
"string": "Enable import of devices from a UNIFI controller. If you select <code>schedule</code> the scheduling settings from below are applied. If you select <code>once</code> the scan is run only once on start of the application (container) or after you update your settings. ⚠ Use the same schedule if you have multiple <i class=\"fa-solid fa-magnifying-glass-plus\"></i> Device scanners enabled."
|
||||
},
|
||||
{
|
||||
"language_code": "es_es",
|
||||
|
||||
@@ -32,6 +32,9 @@ LOCK_FILE = os.path.join(CUR_PATH, 'full_run.lock')
|
||||
|
||||
requests.packages.urllib3.disable_warnings(InsecureRequestWarning)
|
||||
|
||||
|
||||
pluginName = 'UNFIMP'
|
||||
|
||||
# Workflow
|
||||
|
||||
def main():
|
||||
@@ -131,9 +134,15 @@ def get_entries(plugin_objects: Plugin_Objects) -> Plugin_Objects:
|
||||
|
||||
name = set_name(name, hostName)
|
||||
|
||||
ipTmp = get_unifi_val(ap, 'ip')
|
||||
|
||||
# if IP not found use a default value
|
||||
if ipTmp == "null":
|
||||
ipTmp = '0.0.0.0'
|
||||
|
||||
plugin_objects.add_object(
|
||||
primaryId=ap['mac'],
|
||||
secondaryId=get_unifi_val(ap, 'ip'),
|
||||
secondaryId=ipTmp,
|
||||
watched1=name,
|
||||
watched2='Ubiquiti Networks Inc.',
|
||||
watched3=deviceType,
|
||||
@@ -175,6 +184,10 @@ def get_entries(plugin_objects: Plugin_Objects) -> Plugin_Objects:
|
||||
if ipTmp == 'null':
|
||||
ipTmp = get_unifi_val(user, 'fixed_ip')
|
||||
|
||||
# if IP not found use a default value
|
||||
if ipTmp == "null":
|
||||
ipTmp = '0.0.0.0'
|
||||
|
||||
plugin_objects.add_object(
|
||||
primaryId=user['mac'],
|
||||
secondaryId=ipTmp,
|
||||
@@ -206,6 +219,7 @@ def get_unifi_val(obj, key):
|
||||
if res not in ['','None', None]:
|
||||
return res
|
||||
|
||||
mylog('debug', [f'[{pluginName}] Value not found for key "{key}" in obj "{json.dumps(obj)}"'])
|
||||
|
||||
return 'null'
|
||||
|
||||
|
||||
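# Hedged sketch of the fallback pattern used in the UNFIMP changes above:
# get_unifi_val() returns the string 'null' when a key is missing, so callers
# substitute a safe default before writing the IP to the result file.
# get_ip_or_default() is a hypothetical wrapper, not part of the plugin.
def get_ip_or_default(obj, default='0.0.0.0'):
    ipTmp = get_unifi_val(obj, 'ip')
    if ipTmp == 'null':
        ipTmp = get_unifi_val(obj, 'fixed_ip')
    if ipTmp == 'null':
        ipTmp = default
    return ipTmp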
@@ -19,7 +19,7 @@ sys.path.append('/home/pi/pialert/pialert')
|
||||
from plugin_helper import Plugin_Object, Plugin_Objects, decodeBase64, handleEmpty
|
||||
from logger import mylog, append_line_to_file
|
||||
from helper import timeNowTZ
|
||||
from const import logPath, pialertPath
|
||||
from const import logPath, pialertPath, fullDbPath
|
||||
from device import query_MAC_vendor
|
||||
|
||||
|
||||
@@ -37,7 +37,7 @@ def main():
|
||||
# Resolve missing vendors
|
||||
plugin_objects = Plugin_Objects(RESULT_FILE)
|
||||
|
||||
plugin_objects = update_vendors('/home/pi/pialert/db/pialert.db', plugin_objects)
|
||||
plugin_objects = update_vendors(fullDbPath, plugin_objects)
|
||||
|
||||
plugin_objects.write_result_file()
|
||||
|
||||
@@ -60,8 +60,8 @@ def update_vendor_database():
|
||||
update_output = subprocess.check_output (update_args)
|
||||
except subprocess.CalledProcessError as e:
|
||||
# An error occurred, handle it
|
||||
mylog('none', [' FAILED: Updating vendors DB, set LOG_LEVEL=debug for more info'])
|
||||
mylog('none', [e.output])
|
||||
mylog('verbose', [' FAILED: Updating vendors DB, set LOG_LEVEL=debug for more info'])
|
||||
mylog('verbose', [e.output])
|
||||
|
||||
# ------------------------------------------------------------------------------
|
||||
# resolve missing vendors
|
||||
|
||||
@@ -173,7 +173,7 @@ function processColumnValue(dbColumnDef, value, index, type) {
|
||||
|
||||
for (const option of dbColumnDef.options) {
|
||||
if (option.type === type) {
|
||||
console.log(option.param)
|
||||
// console.log(option.param)
|
||||
value = eval(option.param);
|
||||
}
|
||||
}
|
||||
|
||||
@@ -245,21 +245,11 @@ while ($row = $result -> fetchArray (SQLITE3_ASSOC)) {
|
||||
`)
|
||||
}
|
||||
|
||||
// Start constructing the main settings HTML
|
||||
let pluginHtml = `
|
||||
<div class="row table_row">
|
||||
<div class="table_cell bold">
|
||||
<i class="fa-regular fa-book fa-sm"></i>
|
||||
<a href="https://github.com/jokob-sk/Pi.Alert/tree/main/front/plugins" target="_blank">
|
||||
${getString('Gen_ReadDocs')}
|
||||
</a>
|
||||
</div>
|
||||
</div>
|
||||
`;
|
||||
|
||||
|
||||
let isIn = ' in '; // to open the active panel in AdminLTE
|
||||
|
||||
for (const group of settingGroups) {
|
||||
for (const group of settingGroups) {
|
||||
|
||||
// enabled / disabled icons
|
||||
enabledHtml = ''
|
||||
@@ -277,7 +267,20 @@ while ($row = $result -> fetchArray (SQLITE3_ASSOC)) {
|
||||
`
|
||||
}
|
||||
|
||||
headerHtml = `<div class="box box-solid box-primary panel panel-default">
|
||||
// Start constructing the main settings HTML
|
||||
let pluginHtml = `
|
||||
<div class="row table_row">
|
||||
<div class="table_cell bold">
|
||||
<i class="fa-regular fa-book fa-sm"></i>
|
||||
<a href="https://github.com/jokob-sk/Pi.Alert/tree/main/front/plugins/${getPluginCodeName(pluginsData, group)}" target="_blank">
|
||||
${getString('Gen_ReadDocs')}
|
||||
</a>
|
||||
</div>
|
||||
</div>
|
||||
`;
|
||||
|
||||
// Plugin HEADER
|
||||
headerHtml = `<div class="box box-solid box-primary panel panel-default" id="${group}_header">
|
||||
<a data-toggle="collapse" data-parent="#accordion_gen" href="#${group}">
|
||||
<div class="panel-heading">
|
||||
<h4 class="panel-title">
|
||||
@@ -787,37 +790,6 @@ while ($row = $result -> fetchArray (SQLITE3_ASSOC)) {
|
||||
|
||||
}
|
||||
|
||||
|
||||
|
||||
// -----------------------------------------------------------------------------
|
||||
function toggleAllSettings()
|
||||
{
|
||||
inStr = ' in';
|
||||
allOpen = true;
|
||||
openIcon = 'fa-angle-double-down';
|
||||
closeIcon = 'fa-angle-double-up';
|
||||
|
||||
$('.panel-collapse').each(function(){
|
||||
if($(this).attr('class').indexOf(inStr) == -1)
|
||||
{
|
||||
allOpen = false;
|
||||
}
|
||||
})
|
||||
|
||||
if(allOpen)
|
||||
{
|
||||
// close all
|
||||
$('div[data-myid="collapsible"]').each(function(){$(this).attr('class', 'panel-collapse collapse ')})
|
||||
$('#toggleSettings').attr('class', $('#toggleSettings').attr('class').replace(closeIcon, openIcon))
|
||||
}
|
||||
else{
|
||||
// open all
|
||||
$('div[data-myid="collapsible"]').each(function(){$(this).attr('class', 'panel-collapse collapse in')})
|
||||
$('div[data-myid="collapsible"]').each(function(){$(this).attr('style', 'height:inherit')})
|
||||
$('#toggleSettings').attr('class', $('#toggleSettings').attr('class').replace(openIcon, closeIcon))
|
||||
}
|
||||
|
||||
}
|
||||
|
||||
getData()
|
||||
|
||||
|
||||
@@ -1,4 +1,4 @@
|
||||
#!/bin/bash
|
||||
#!/usr/bin/env bash
|
||||
|
||||
echo "---------------------------------------------------------"
|
||||
echo "[INSTALL] Run install.sh"
|
||||
@@ -26,6 +26,10 @@ rm -R $INSTALL_DIR/pialert
|
||||
# Clone the application repository
|
||||
git clone https://github.com/jokob-sk/Pi.Alert "$INSTALL_DIR/pialert"
|
||||
|
||||
# Check for buildtimestamp.txt existence, otherwise create it
|
||||
if [ ! -f $INSTALL_DIR/pialert/front/buildtimestamp.txt ]; then
|
||||
date +%s > $INSTALL_DIR/pialert/front/buildtimestamp.txt
|
||||
fi
|
||||
|
||||
# Start PiAlert
|
||||
"$INSTALL_DIR/pialert/dockerfiles/start.sh"
|
||||
|
||||
@@ -1,4 +1,4 @@
|
||||
#!/bin/bash
|
||||
#!/usr/bin/env bash
|
||||
|
||||
echo "---------------------------------------------------------"
|
||||
echo "[INSTALL] Run install_dependencies.sh"
|
||||
|
||||
@@ -1,6 +1,6 @@
|
||||
server {
|
||||
listen 80 default_server;
|
||||
root /var/www/html;
|
||||
listen 20211 default_server;
|
||||
root /var/www/html/pialert;
|
||||
index index.php;
|
||||
#rewrite /pialert/(.*) / permanent;
|
||||
add_header X-Forwarded-Prefix "/pialert" always;
|
||||
@@ -15,4 +15,4 @@ server {
|
||||
fastcgi_send_timeout 600;
|
||||
fastcgi_read_timeout 600;
|
||||
}
|
||||
}
|
||||
}
|
||||
@@ -7,18 +7,22 @@ The original pilaert.py code is now moved to this new folder and split into diff
|
||||
|```__main__.py```| The MAIN program of Pi.Alert|
|
||||
|```__init__.py```| an empty init file|
|
||||
|```README.md```| this readme file|
|
||||
|**publishers**| a folder containing all modules used to publish the results|
|
||||
|```api.py```| updating the API endpoints with the relevant data. (Should move to publishers)|
|
||||
|```../front/plugins ```| a folder containing all [plugins](/front/plugins/) that publish notifications or scan for devices|
|
||||
|```api.py```| updating the API endpoints with the relevant data. |
|
||||
|```appevent.py```| TBC |
|
||||
|```const.py```| A place to define the constants for Pi.Alert like log path or config path.|
|
||||
|```conf.py```| conf.py holds the configuration variables and makes them available for all modules. It is also the <b>workaround</b> for global variables that need to be resolved at some point|
|
||||
|```database.py```| This module connects to the DB, makes sure the DB is up to date and defines some standard queries and interfaces. |
|
||||
|```device.py```| The device module looks after the devices and saves the scan results into the Devices database table |
|
||||
|```flows.py```| TBC |
|
||||
|```helper.py```| Helper, as the name suggests, contains multiple little functions and methods used in many of the other modules and helps keep things clean |
|
||||
|```initialise.py```| Initialise sets up the environment and makes everything ready to go |
|
||||
|```logger.py```| Logger is there to keep all the logs organised and looking identical. |
|
||||
|```networscan.py```| Networkscan orchestrates the actual scanning of the network, calling the individual scanners and managing the results |
|
||||
|```networscan.py```| Networkscan collects the scan results (maybe to merge with `reporting.py`) |
|
||||
|```notification.py```| Creates and handles the notification object and generates the HTML and text variants of the message |
|
||||
|```plugin.py```| This is where the plugins get integrated into the backend of Pi.Alert |
|
||||
|```reporting.py```| Reporting generates the email, html and json reports to be sent by the publishers |
|
||||
|```plugin_utils.py```| Helper utilities for `plugin.py` |
|
||||
|```reporting.py```| Reporting collects the data for the notification reports |
|
||||
|```scheduler.py```| All things scheduling |
|
||||
|
||||
|
||||
|
||||
@@ -123,7 +123,7 @@ def main ():
|
||||
# determine run/scan type based on passed time
|
||||
# --------------------------------------------
|
||||
|
||||
# Run splugin scripts which are set to run every timne after a scans finished
|
||||
# Runs plugin scripts which are set to run every time after a scan finishes
|
||||
pluginsState = run_plugin_scripts(db,'always_after_scan', pluginsState)
|
||||
|
||||
|
||||
@@ -151,32 +151,19 @@ def main ():
|
||||
# ----------------------------------------
|
||||
|
||||
# send all configured notifications
|
||||
notiStructure = get_notifications(db)
|
||||
final_json = get_notifications(db)
|
||||
|
||||
# Write the notifications into the DB
|
||||
notification = Notification_obj(db)
|
||||
notificationObj = notification.create(notiStructure.json, notiStructure.text, notiStructure.html, "")
|
||||
notificationObj = notification.create(final_json, "")
|
||||
|
||||
# run all enabled publisher gateways
|
||||
if notificationObj.HasNotifications:
|
||||
pluginsState = run_plugin_scripts(db, 'on_notification', pluginsState)
|
||||
notification.setAllProcessed()
|
||||
notification.clearPendingEmailFlag()
|
||||
|
||||
# Clean Pending Alert Events
|
||||
sql.execute ("""UPDATE Devices SET dev_LastNotification = ?
|
||||
WHERE dev_MAC IN (
|
||||
SELECT eve_MAC FROM Events
|
||||
WHERE eve_PendingAlertEmail = 1
|
||||
)
|
||||
""", (timeNowTZ(),) )
|
||||
sql.execute ("""UPDATE Events SET eve_PendingAlertEmail = 0
|
||||
WHERE eve_PendingAlertEmail = 1""")
|
||||
|
||||
# clear plugin events
|
||||
sql.execute ("DELETE FROM Plugins_Events")
|
||||
|
||||
# DEBUG - print number of rows updated
|
||||
mylog('minimal', ['[Notification] Notifications changes: ', sql.rowcount])
|
||||
|
||||
else:
|
||||
mylog('verbose', ['[Notification] No changes to report'])
|
||||
|
||||
|
||||
@@ -32,12 +32,10 @@ arpscan_devices = []
|
||||
SCAN_SUBNETS = ['192.168.1.0/24 --interface=eth1', '192.168.1.0/24 --interface=eth0']
|
||||
LOG_LEVEL = 'verbose'
|
||||
TIMEZONE = 'Europe/Berlin'
|
||||
DIG_GET_IP_ARG = '-4 myip.opendns.com @resolver1.opendns.com'
|
||||
UI_LANG = 'English'
|
||||
UI_PRESENCE = ['online', 'offline', 'archived']
|
||||
PIALERT_WEB_PROTECTION = False
|
||||
PIALERT_WEB_PASSWORD = '8d969eef6ecad3c29a3a629280e686cf0c3f5d5a86aff3ca12020c923adc6c92'
|
||||
INCLUDED_SECTIONS = ['new_devices', 'down_devices', 'events']
|
||||
DAYS_TO_KEEP_EVENTS = 90
|
||||
REPORT_DASHBOARD_URL = 'http://pi.alert/'
|
||||
|
||||
|
||||
@@ -24,7 +24,7 @@ class DB():
|
||||
def open (self):
|
||||
# Check if DB is open
|
||||
if self.sql_connection != None :
|
||||
mylog('debug','openDB: databse already open')
|
||||
mylog('debug','openDB: database already open')
|
||||
return
|
||||
|
||||
mylog('none', '[Database] Opening DB' )
|
||||
@@ -42,7 +42,7 @@ class DB():
|
||||
#-------------------------------------------------------------------------------
|
||||
def commitDB (self):
|
||||
if self.sql_connection == None :
|
||||
mylog('debug','commitDB: databse is not open')
|
||||
mylog('debug','commitDB: database is not open')
|
||||
return False
|
||||
|
||||
# Commit changes to DB
|
||||
@@ -57,7 +57,7 @@ class DB():
|
||||
#-------------------------------------------------------------------------------
|
||||
def get_sql_array(self, query):
|
||||
if self.sql_connection == None :
|
||||
mylog('debug','getQueryArray: databse is not open')
|
||||
mylog('debug','getQueryArray: database is not open')
|
||||
return
|
||||
|
||||
self.sql.execute(query)
|
||||
|
||||
@@ -41,8 +41,8 @@ def print_scan_stats(db):
|
||||
SELECT
|
||||
(SELECT COUNT(*) FROM CurrentScan) AS devices_detected,
|
||||
(SELECT COUNT(*) FROM CurrentScan WHERE NOT EXISTS (SELECT 1 FROM Devices WHERE dev_MAC = cur_MAC)) AS new_devices,
|
||||
(SELECT COUNT(*) FROM Devices WHERE dev_AlertDeviceDown = 1 AND NOT EXISTS (SELECT 1 FROM CurrentScan WHERE dev_MAC = cur_MAC)) AS down_alerts,
|
||||
(SELECT COUNT(*) FROM Devices WHERE dev_AlertDeviceDown = 1 AND dev_PresentLastScan = 1 AND NOT EXISTS (SELECT 1 FROM CurrentScan WHERE dev_MAC = cur_MAC)) AS new_down_alerts,
|
||||
(SELECT COUNT(*) FROM Devices WHERE dev_AlertDeviceDown != 0 AND NOT EXISTS (SELECT 1 FROM CurrentScan WHERE dev_MAC = cur_MAC)) AS down_alerts,
|
||||
(SELECT COUNT(*) FROM Devices WHERE dev_AlertDeviceDown != 0 AND dev_PresentLastScan = 1 AND NOT EXISTS (SELECT 1 FROM CurrentScan WHERE dev_MAC = cur_MAC)) AS new_down_alerts,
|
||||
(SELECT COUNT(*) FROM Devices WHERE dev_PresentLastScan = 0) AS new_connections,
|
||||
(SELECT COUNT(*) FROM Devices WHERE dev_PresentLastScan = 1 AND NOT EXISTS (SELECT 1 FROM CurrentScan WHERE dev_MAC = cur_MAC)) AS disconnections,
|
||||
(SELECT COUNT(*) FROM Devices, CurrentScan WHERE dev_MAC = cur_MAC AND dev_LastIP <> cur_IP) AS ip_changes,
|
||||
@@ -200,17 +200,30 @@ def update_devices_data_from_scan (db):
|
||||
WHERE NOT EXISTS (SELECT 1 FROM CurrentScan
|
||||
WHERE dev_MAC = cur_MAC) """)
|
||||
|
||||
# Update IP & Vendor
|
||||
mylog('debug', '[Update Devices] - 3 LastIP & Vendor')
|
||||
# Update IP
|
||||
mylog('debug', '[Update Devices] - 3 LastIP ')
|
||||
sql.execute("""UPDATE Devices
|
||||
SET dev_LastIP = (SELECT cur_IP FROM CurrentScan
|
||||
WHERE dev_MAC = cur_MAC),
|
||||
dev_Vendor = (SELECT cur_Vendor FROM CurrentScan
|
||||
WHERE dev_MAC = cur_MAC
|
||||
)
|
||||
WHERE dev_MAC = cur_MAC)
|
||||
WHERE EXISTS (SELECT 1 FROM CurrentScan
|
||||
WHERE dev_MAC = cur_MAC) """)
|
||||
|
||||
# Update only devices with empty or NULL vendors
|
||||
mylog('debug', '[Update Devices] - 3 Vendor')
|
||||
sql.execute("""UPDATE Devices
|
||||
SET dev_Vendor = (
|
||||
SELECT cur_Vendor
|
||||
FROM CurrentScan
|
||||
WHERE dev_MAC = cur_MAC
|
||||
)
|
||||
WHERE
|
||||
(dev_Vendor = "" OR dev_Vendor IS NULL)
|
||||
AND EXISTS (
|
||||
SELECT 1
|
||||
FROM CurrentScan
|
||||
WHERE dev_MAC = cur_MAC
|
||||
)""")
|
||||
|
||||
# Update (unknown) or (name not found) Names if available
|
||||
mylog('debug','[Update Devices] - 4 Unknown Name')
|
||||
sql.execute ("""UPDATE Devices
|
||||
@@ -352,10 +365,15 @@ def query_MAC_vendor (pMAC):
|
||||
try:
|
||||
with open(vendorsPath, 'r') as f:
|
||||
for line in f:
|
||||
if line.startswith(mac_start_string6):
|
||||
vendor = line.split(' ', 1)[1].strip()
|
||||
mylog('debug', [f"[Vendor Check] Found '{vendor}' for '{pMAC}' in {vendorsPath}"])
|
||||
return vendor
|
||||
if line.startswith(mac_start_string6):
|
||||
parts = line.split(' ', 1)
|
||||
if len(parts) > 1:
|
||||
vendor = parts[1].strip()
|
||||
mylog('debug', [f"[Vendor Check] Found '{vendor}' for '{pMAC}' in {vendorsPath}"])
|
||||
return vendor
|
||||
else:
|
||||
mylog('debug', [f'[Vendor Check] ⚠ ERROR: Match found, but line could not be processed: "{line}"'])
|
||||
return -1
|
||||
|
||||
|
||||
return -1 # MAC address not found in the database
|
||||
|
||||
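# Hedged sketch of the guarded parsing introduced above, using a hypothetical
# vendors-file line of the form "<mac-prefix> <vendor name>"; the exact layout
# depends on the external vendors database that update_vendor_database() pulls.
line = "001122 Example Vendor Inc."       # hypothetical entry
parts = line.split(' ', 1)
if len(parts) > 1:
    print(parts[1].strip())               # -> "Example Vendor Inc."
else:
    print(-1)                             # malformed line, mirroring the -1 return above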
@@ -37,6 +37,12 @@ def timeNowTZ():
|
||||
def timeNow():
|
||||
return datetime.datetime.now().replace(microsecond=0)
|
||||
|
||||
def get_timezone_offset():
|
||||
now = datetime.datetime.now(conf.tz)
|
||||
offset_hours = now.utcoffset().total_seconds() / 3600
|
||||
offset_formatted = "{:+03d}:{:02d}".format(int(offset_hours), int((offset_hours % 1) * 60))
|
||||
return offset_formatted
|
||||
|
||||
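# Usage sketch for the new get_timezone_offset() helper: the "+HH:MM" string it
# returns is a valid SQLite datetime() modifier, which is how the notification
# code below filters 'Device Down' events older than NTFPRCS_alert_down_time
# minutes. (With conf.tz set to Europe/Berlin during DST this yields "+02:00".)
offset = get_timezone_offset()
sql_filter = f"eve_DateTime < datetime('now', '-5 minutes', '{offset}')"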
|
||||
#-------------------------------------------------------------------------------
|
||||
# App state
|
||||
@@ -376,7 +382,7 @@ def resolve_device_name_dig (pMAC, pIP):
|
||||
# Cleanup
|
||||
newName = cleanDeviceName(newName, True)
|
||||
|
||||
if newName == "" or len(newName) == 0 or newName == '-1' or newName == -1 or "communications error" in newName:
|
||||
if newName == "" or len(newName) == 0 or newName == '-1' or newName == -1 or "communications error" in newName or 'malformed message packet' in newName :
|
||||
return nameNotFound
|
||||
|
||||
# all checks passed
|
||||
@@ -470,58 +476,6 @@ def resolve_device_name_pholus (pMAC, pIP, allRes, nameNotFound, match_IP = Fals
|
||||
if 'PTR Class:IN' in value and len(value.split('"')) > 1:
|
||||
return cleanDeviceName(value.split('"')[1], match_IP)
|
||||
|
||||
|
||||
# # airplay matches contain a lot of information
|
||||
# # Matches for example:
|
||||
# # Brand Tv (50)._airplay._tcp.local. TXT Class:32769 "acl=0 deviceid=66:66:66:66:66:66 features=0x77777,0x38BCB46 rsf=0x3 fv=p20.T-FFFFFF-03.1 flags=0x204 model=XXXX manufacturer=Brand serialNumber=XXXXXXXXXXX protovers=1.1 srcvers=777.77.77 pi=FF:FF:FF:FF:FF:FF psi=00000000-0000-0000-0000-FFFFFFFFFF gid=00000000-0000-0000-0000-FFFFFFFFFF gcgl=0 pk=AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA"
|
||||
# for i in pholusMatchesIndexes:
|
||||
# if checkIPV4(allRes[i]['IP_v4_or_v6']) and '._airplay._tcp.local. TXT Class:32769' in str(allRes[i]["Value"]) :
|
||||
# return cleanDeviceName(allRes[i]["Value"].split('._airplay._tcp.local. TXT Class:32769')[0], match_IP)
|
||||
|
||||
# # second best - contains airplay
|
||||
# # Matches for example:
|
||||
# # _airplay._tcp.local. PTR Class:IN "Brand Tv (50)._airplay._tcp.local."
|
||||
# for i in pholusMatchesIndexes:
|
||||
# if checkIPV4(allRes[i]['IP_v4_or_v6']) and '_airplay._tcp.local. PTR Class:IN' in allRes[i]["Value"] and ('._googlecast') not in allRes[i]["Value"]:
|
||||
# return cleanDeviceName(allRes[i]["Value"].split('"')[1], match_IP)
|
||||
|
||||
# # Contains PTR Class:32769
|
||||
# # Matches for example:
|
||||
# # 3.1.168.192.in-addr.arpa. PTR Class:32769 "MyPc.local."
|
||||
# for i in pholusMatchesIndexes:
|
||||
# if checkIPV4(allRes[i]['IP_v4_or_v6']) and 'PTR Class:32769' in allRes[i]["Value"]:
|
||||
# return cleanDeviceName(allRes[i]["Value"].split('"')[1], match_IP)
|
||||
|
||||
# # Contains AAAA Class:IN
|
||||
# # Matches for example:
|
||||
# # DESKTOP-SOMEID.local. AAAA Class:IN "fe80::fe80:fe80:fe80:fe80"
|
||||
# for i in pholusMatchesIndexes:
|
||||
# if checkIPV4(allRes[i]['IP_v4_or_v6']) and 'AAAA Class:IN' in allRes[i]["Value"]:
|
||||
# return cleanDeviceName(allRes[i]["Value"].split('.local.')[0], match_IP)
|
||||
|
||||
# # Contains _googlecast._tcp.local. PTR Class:IN
|
||||
# # Matches for example:
|
||||
# # _googlecast._tcp.local. PTR Class:IN "Nest-Audio-ff77ff77ff77ff77ff77ff77ff77ff77._googlecast._tcp.local."
|
||||
# for i in pholusMatchesIndexes:
|
||||
# if checkIPV4(allRes[i]['IP_v4_or_v6']) and '_googlecast._tcp.local. PTR Class:IN' in allRes[i]["Value"] and ('Google-Cast-Group') not in allRes[i]["Value"]:
|
||||
# return cleanDeviceName(allRes[i]["Value"].split('"')[1], match_IP)
|
||||
|
||||
# # Contains A Class:32769
|
||||
# # Matches for example:
|
||||
# # Android.local. A Class:32769 "192.168.1.6"
|
||||
# for i in pholusMatchesIndexes:
|
||||
# if checkIPV4(allRes[i]['IP_v4_or_v6']) and ' A Class:32769' in allRes[i]["Value"]:
|
||||
# return cleanDeviceName(allRes[i]["Value"].split(' A Class:32769')[0], match_IP)
|
||||
|
||||
|
||||
# # # Contains PTR Class:IN
|
||||
# # Matches for example:
|
||||
# # _esphomelib._tcp.local. PTR Class:IN "ceiling-light-1._esphomelib._tcp.local."
|
||||
# for i in pholusMatchesIndexes:
|
||||
# if checkIPV4(allRes[i]['IP_v4_or_v6']) and 'PTR Class:IN' in allRes[i]["Value"]:
|
||||
# if allRes[i]["Value"] and len(allRes[i]["Value"].split('"')) > 1:
|
||||
# return cleanDeviceName(allRes[i]["Value"].split('"')[1], match_IP)
|
||||
|
||||
return nameNotFound
|
||||
|
||||
|
||||
@@ -709,9 +663,9 @@ def checkNewVersion():
|
||||
|
||||
dateTimeStr = data[0]["published_at"]
|
||||
|
||||
realeaseTimestamp = int(datetime.datetime.strptime(dateTimeStr, '%Y-%m-%dT%H:%M:%SZ').strftime('%s'))
|
||||
releaseTimestamp = int(datetime.datetime.strptime(dateTimeStr, '%Y-%m-%dT%H:%M:%S%z').timestamp())
|
||||
|
||||
if realeaseTimestamp > buildTimestamp + 600:
|
||||
if releaseTimestamp > buildTimestamp + 600:
|
||||
mylog('none', ["[Version check] New version of the container available!"])
|
||||
newVersion = True
|
||||
else:
|
||||
|
||||
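# Minimal sketch of the corrected release-timestamp parsing above: GitHub's
# published_at is ISO-8601 UTC ("...Z"), which %z can parse on Python 3.7+, and
# .timestamp() converts it to epoch seconds regardless of the local timezone
# (the old strftime('%s') formatted a naive datetime with a non-portable code).
import datetime

dateTimeStr = "2024-01-15T08:30:00Z"      # illustrative value from the GitHub API
releaseTimestamp = int(datetime.datetime.strptime(dateTimeStr, '%Y-%m-%dT%H:%M:%S%z').timestamp())
print(releaseTimestamp)                    # -> epoch seconds (int)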
@@ -103,15 +103,12 @@ def importConfigs (db):
|
||||
conf.TIMEZONE = ccd('TIMEZONE', 'Europe/Berlin' , c_d, 'Time zone', 'text', '', 'General')
|
||||
conf.PLUGINS_KEEP_HIST = ccd('PLUGINS_KEEP_HIST', 250 , c_d, 'Keep history entries', 'integer', '', 'General')
|
||||
conf.PIALERT_WEB_PROTECTION = ccd('PIALERT_WEB_PROTECTION', False , c_d, 'Enable logon', 'boolean', '', 'General')
|
||||
conf.PIALERT_WEB_PASSWORD = ccd('PIALERT_WEB_PASSWORD', '8d969eef6ecad3c29a3a629280e686cf0c3f5d5a86aff3ca12020c923adc6c92' , c_d, 'Logon password', 'readonly', '', 'General')
|
||||
conf.INCLUDED_SECTIONS = ccd('INCLUDED_SECTIONS', ['new_devices', 'down_devices', 'events'] , c_d, 'Notify on', 'text.multiselect', "['new_devices', 'down_devices', 'events', 'plugins']", 'General')
|
||||
conf.PIALERT_WEB_PASSWORD = ccd('PIALERT_WEB_PASSWORD', '8d969eef6ecad3c29a3a629280e686cf0c3f5d5a86aff3ca12020c923adc6c92' , c_d, 'Logon password', 'readonly', '', 'General')
|
||||
conf.REPORT_DASHBOARD_URL = ccd('REPORT_DASHBOARD_URL', 'http://pi.alert/' , c_d, 'PiAlert URL', 'text', '', 'General')
|
||||
conf.DIG_GET_IP_ARG = ccd('DIG_GET_IP_ARG', '-4 myip.opendns.com @resolver1.opendns.com' , c_d, 'DIG arguments', 'text', '', 'General')
|
||||
conf.UI_LANG = ccd('UI_LANG', 'English' , c_d, 'Language Interface', 'text.select', "['English', 'German', 'Spanish']", 'General')
|
||||
conf.UI_PRESENCE = ccd('UI_PRESENCE', ['online', 'offline', 'archived'] , c_d, 'Include in presence', 'text.multiselect', "['online', 'offline', 'archived']", 'General')
|
||||
conf.DAYS_TO_KEEP_EVENTS = ccd('DAYS_TO_KEEP_EVENTS', 90 , c_d, 'Delete events days', 'integer', '', 'General')
|
||||
conf.HRS_TO_KEEP_NEWDEV = ccd('HRS_TO_KEEP_NEWDEV', 0 , c_d, 'Keep new devices for', 'integer', "0", 'General')
|
||||
conf.DBCLNP_NOTIFI_HIST = ccd('DBCLNP_NOTIFI_HIST', 100 , c_d, 'Keep notification', 'integer', "0", 'General')
|
||||
conf.HRS_TO_KEEP_NEWDEV = ccd('HRS_TO_KEEP_NEWDEV', 0 , c_d, 'Keep new devices for', 'integer', "0", 'General')
|
||||
conf.API_CUSTOM_SQL = ccd('API_CUSTOM_SQL', 'SELECT * FROM Devices WHERE dev_PresentLastScan = 0' , c_d, 'Custom endpoint', 'text', '', 'General')
|
||||
conf.NETWORK_DEVICE_TYPES = ccd('NETWORK_DEVICE_TYPES', ['AP', 'Gateway', 'Firewall', 'Hypervisor', 'Powerline', 'Switch', 'WLAN', 'PLC', 'Router','USB LAN Adapter', 'USB WIFI Adapter', 'Internet'] , c_d, 'Network device types', 'list', '', 'General')
|
||||
|
||||
@@ -251,6 +248,7 @@ def read_config_file(filename):
|
||||
|
||||
#-------------------------------------------------------------------------------
|
||||
# DEPRECATED, to be removed after 3/3/2024 at the earliest
|
||||
# 🤔Idea/TODO: Check and compare versions/timestamps and only perform a replacement if the config/version is older than...
|
||||
replacements = {
|
||||
r'\bREPORT_TO\b': 'SMTP_REPORT_TO',
|
||||
r'\bREPORT_FROM\b': 'SMTP_REPORT_FROM',
|
||||
@@ -259,7 +257,10 @@ replacements = {
|
||||
r'REPORT_NTFY=True': 'NTFY_RUN=\'on_notification\'',
|
||||
r'REPORT_WEBHOOK=True': 'WEBHOOK_RUN=\'on_notification\'',
|
||||
r'REPORT_PUSHSAFER=True': 'PUSHSAFER_RUN=\'on_notification\'',
|
||||
r'REPORT_MQTT=True': 'MQTT_RUN=\'on_notification\''
|
||||
r'REPORT_MQTT=True': 'MQTT_RUN=\'on_notification\'',
|
||||
r'PIHOLE_CMD=': 'PIHOLE_CMD_OLD=',
|
||||
r'\bINCLUDED_SECTIONS\b': 'NTFPRCS_INCLUDED_SECTIONS',
|
||||
r'\bDIG_GET_IP_ARG\b': 'INTRNT_DIG_GET_IP_ARG'
|
||||
}
|
||||
|
||||
def renameSettings(config_file):
|
||||
|
||||
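# Hedged sketch of how the deprecated-setting renames above could be applied
# (the body of renameSettings() is not shown in this diff): read the config,
# run every regex replacement from the map, and write the file back in place.
import re

def apply_renames(config_file, replacements):
    with open(config_file, 'r') as f:
        content = f.read()
    for pattern, replacement in replacements.items():
        content = re.sub(pattern, replacement, content)
    with open(config_file, 'w') as f:
        f.write(content)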
@@ -187,7 +187,7 @@ def insert_events (db):
|
||||
eve_PendingAlertEmail)
|
||||
SELECT dev_MAC, dev_LastIP, '{startTime}', 'Device Down', '', 1
|
||||
FROM Devices
|
||||
WHERE dev_AlertDeviceDown = 1
|
||||
WHERE dev_AlertDeviceDown != 0
|
||||
AND dev_PresentLastScan = 1
|
||||
AND NOT EXISTS (SELECT 1 FROM CurrentScan
|
||||
WHERE dev_MAC = cur_MAC
|
||||
|
||||
@@ -1,13 +1,16 @@
|
||||
import datetime
|
||||
import json
|
||||
import uuid
|
||||
import socket
|
||||
import subprocess
|
||||
from json2table import convert
|
||||
|
||||
# PiAlert modules
|
||||
import conf
|
||||
import const
|
||||
from const import pialertPath, logPath, apiPath
|
||||
from logger import logResult, mylog, print_log
|
||||
from helper import timeNowTZ
|
||||
from helper import generate_mac_links, removeDuplicateNewLines, timeNowTZ, get_file_content, write_file, get_setting_value, get_timezone_offset
|
||||
|
||||
#-------------------------------------------------------------------------------
|
||||
# Notification object handling
|
||||
@@ -34,11 +37,22 @@ class Notification_obj:
|
||||
|
||||
self.save()
|
||||
|
||||
# Create a new DB entry if new notiifcations available, otherwise skip
|
||||
def create(self, JSON, Text, HTML, Extra=""):
|
||||
# Method to override processing of notifications
|
||||
def on_before_create(self, JSON, Extra):
|
||||
|
||||
return JSON, Extra
|
||||
|
||||
|
||||
# Create a new DB entry if new notifications are available, otherwise skip
|
||||
def create(self, JSON, Extra=""):
|
||||
|
||||
JSON, Extra = self.on_before_create(JSON, Extra)
|
||||
|
||||
# Write output data for debug
|
||||
write_file (logPath + '/report_output.json', json.dumps(JSON))
|
||||
|
||||
# Check if nothing to report, end
|
||||
if JSON["internet"] == [] and JSON["new_devices"] == [] and JSON["down_devices"] == [] and JSON["events"] == [] and JSON["plugins"] == []:
|
||||
if JSON["new_devices"] == [] and JSON["down_devices"] == [] and JSON["events"] == [] and JSON["plugins"] == []:
|
||||
self.HasNotifications = False
|
||||
else:
|
||||
self.HasNotifications = True
|
||||
@@ -48,12 +62,108 @@ class Notification_obj:
|
||||
self.DateTimePushed = ""
|
||||
self.Status = "new"
|
||||
self.JSON = JSON
|
||||
self.Text = Text
|
||||
self.HTML = HTML
|
||||
self.Text = ""
|
||||
self.HTML = ""
|
||||
self.PublishedVia = ""
|
||||
self.Extra = Extra
|
||||
|
||||
if self.HasNotifications:
|
||||
|
||||
|
||||
# if not notiStruc.json['data'] and not notiStruc.text and not notiStruc.html:
|
||||
# mylog('debug', '[Notification] notiStruc is empty')
|
||||
# else:
|
||||
# mylog('debug', ['[Notification] notiStruc:', json.dumps(notiStruc.__dict__, indent=4)])
|
||||
|
||||
Text = ""
|
||||
HTML = ""
|
||||
|
||||
|
||||
# Open text Template
|
||||
mylog('verbose', ['[Notification] Open text Template'])
|
||||
template_file = open(pialertPath + '/back/report_template.txt', 'r')
|
||||
mail_text = template_file.read()
|
||||
template_file.close()
|
||||
|
||||
# Open html Template
|
||||
mylog('verbose', ['[Notification] Open html Template'])
|
||||
|
||||
# select template type depending on whether the latest version or an older one is running
|
||||
if conf.newVersionAvailable :
|
||||
template_file_path = '/back/report_template_new_version.html'
|
||||
else:
|
||||
template_file_path = '/back/report_template.html'
|
||||
|
||||
mylog('verbose', ['[Notification] Using template', template_file_path])
|
||||
template_file = open(pialertPath + template_file_path, 'r')
|
||||
mail_html = template_file.read()
|
||||
template_file.close()
|
||||
|
||||
|
||||
# Report "REPORT_DATE" in Header & footer
|
||||
timeFormated = timeNowTZ().strftime ('%Y-%m-%d %H:%M')
|
||||
mail_text = mail_text.replace ('<REPORT_DATE>', timeFormated)
|
||||
mail_html = mail_html.replace ('<REPORT_DATE>', timeFormated)
|
||||
|
||||
# Report "SERVER_NAME" in Header & footer
|
||||
mail_text = mail_text.replace ('<SERVER_NAME>', socket.gethostname() )
|
||||
mail_html = mail_html.replace ('<SERVER_NAME>', socket.gethostname() )
|
||||
|
||||
# Report "VERSION" in Header & footer
|
||||
VERSIONFILE = subprocess.check_output(['php', pialertPath + '/front/php/templates/version.php']).decode('utf-8')
|
||||
mail_text = mail_text.replace ('<VERSION_PIALERT>', VERSIONFILE)
|
||||
mail_html = mail_html.replace ('<VERSION_PIALERT>', VERSIONFILE)
|
||||
|
||||
# Report "BUILD" in Header & footer
|
||||
BUILDFILE = subprocess.check_output(['php', pialertPath + '/front/php/templates/build.php']).decode('utf-8')
|
||||
mail_text = mail_text.replace ('<BUILD_PIALERT>', BUILDFILE)
|
||||
mail_html = mail_html.replace ('<BUILD_PIALERT>', BUILDFILE)
|
||||
|
||||
# Start generating the TEXT & HTML notification messages
|
||||
html, text = construct_notifications(self.JSON, "new_devices")
|
||||
|
||||
mail_text = mail_text.replace ('<NEW_DEVICES_TABLE>', text + '\n')
|
||||
mail_html = mail_html.replace ('<NEW_DEVICES_TABLE>', html)
|
||||
mylog('verbose', ['[Notification] New Devices sections done.'])
|
||||
|
||||
html, text = construct_notifications(self.JSON, "down_devices")
|
||||
|
||||
|
||||
mail_text = mail_text.replace ('<DOWN_DEVICES_TABLE>', text + '\n')
|
||||
mail_html = mail_html.replace ('<DOWN_DEVICES_TABLE>', html)
|
||||
mylog('verbose', ['[Notification] Down Devices sections done.'])
|
||||
|
||||
html, text = construct_notifications(self.JSON, "events")
|
||||
|
||||
|
||||
mail_text = mail_text.replace ('<EVENTS_TABLE>', text + '\n')
|
||||
mail_html = mail_html.replace ('<EVENTS_TABLE>', html)
|
||||
mylog('verbose', ['[Notification] Events sections done.'])
|
||||
|
||||
|
||||
html, text = construct_notifications(self.JSON, "plugins")
|
||||
|
||||
mail_text = mail_text.replace ('<PLUGINS_TABLE>', text + '\n')
|
||||
mail_html = mail_html.replace ('<PLUGINS_TABLE>', html)
|
||||
|
||||
mylog('verbose', ['[Notification] Plugins sections done.'])
|
||||
|
||||
final_text = removeDuplicateNewLines(mail_text)
|
||||
|
||||
# Create clickable MAC links
|
||||
final_html = generate_mac_links (mail_html, conf.REPORT_DASHBOARD_URL + '/deviceDetails.php?mac=')
|
||||
|
||||
send_api(self.JSON, mail_text, mail_html)
|
||||
|
||||
# Write output data for debug
|
||||
write_file (logPath + '/report_output.txt', final_text)
|
||||
write_file (logPath + '/report_output.html', final_html)
|
||||
|
||||
mylog('minimal', ['[Notification] Updating API files'])
|
||||
|
||||
self.Text = final_text
|
||||
self.HTML = final_html
|
||||
|
||||
self.upsert()
|
||||
|
||||
return self
|
||||
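# Hedged sketch of the new on_before_create() hook: a (hypothetical) subclass
# can adjust the notification JSON before it is stored and rendered, without
# touching the rest of create().
class FilteredNotification(Notification_obj):
    def on_before_create(self, JSON, Extra):
        JSON["events"] = []                # e.g. suppress the events section
        return JSON, Extra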
@@ -107,8 +217,108 @@ class Notification_obj:
|
||||
|
||||
self.save()
|
||||
|
||||
|
||||
def clearPendingEmailFlag(self):
|
||||
|
||||
# Clean Pending Alert Events
|
||||
self.db.sql.execute ("""UPDATE Devices SET dev_LastNotification = ?
|
||||
WHERE dev_MAC IN (
|
||||
SELECT eve_MAC FROM Events
|
||||
WHERE eve_PendingAlertEmail = 1
|
||||
)
|
||||
""", (timeNowTZ(),) )
|
||||
|
||||
self.db.sql.execute ("""UPDATE Events SET eve_PendingAlertEmail = 0
|
||||
WHERE eve_PendingAlertEmail = 1
|
||||
AND eve_EventType !='Device Down' """)
|
||||
|
||||
# Clear down events flag after the reporting window passed
|
||||
self.db.sql.execute (f"""UPDATE Events SET eve_PendingAlertEmail = 0
|
||||
WHERE eve_PendingAlertEmail = 1
|
||||
AND eve_EventType =='Device Down'
|
||||
AND eve_DateTime < datetime('now', '-{get_setting_value('NTFPRCS_alert_down_time')} minutes', '{get_timezone_offset()}')
|
||||
""")
|
||||
|
||||
# clear plugin events
|
||||
self.db.sql.execute ("DELETE FROM Plugins_Events")
|
||||
|
||||
# DEBUG - print number of rows updated
|
||||
mylog('minimal', ['[Notification] Notifications changes: ', self.db.sql.rowcount])
|
||||
|
||||
self.save()
|
||||
|
||||
def save(self):
|
||||
# Commit changes
|
||||
self.db.commitDB()
|
||||
self.db.commitDB()
|
||||
|
||||
#-------------------------------------------------------------------------------
|
||||
# Reporting
|
||||
#-------------------------------------------------------------------------------
|
||||
|
||||
#-------------------------------------------------------------------------------
|
||||
def construct_notifications(JSON, section):
|
||||
|
||||
jsn = JSON[section]
|
||||
|
||||
# Return if empty
|
||||
if jsn == []:
|
||||
return '',''
|
||||
|
||||
tableTitle = JSON[section + "_meta"]["title"]
|
||||
headers = JSON[section + "_meta"]["columnNames"]
|
||||
|
||||
html = ''
|
||||
text = ''
|
||||
|
||||
table_attributes = {"style" : "border-collapse: collapse; font-size: 12px; color:#70707", "width" : "100%", "cellspacing" : 0, "cellpadding" : "3px", "bordercolor" : "#C0C0C0", "border":"1"}
|
||||
headerProps = "width='120px' style='color:white; font-size: 16px;' bgcolor='#64a0d6' "
|
||||
thProps = "width='120px' style='color:#F0F0F0' bgcolor='#64a0d6' "
|
||||
|
||||
build_direction = "TOP_TO_BOTTOM"
|
||||
text_line = '{}\t{}\n'
|
||||
|
||||
|
||||
if len(jsn) > 0:
|
||||
text = tableTitle + "\n---------\n"
|
||||
|
||||
# Convert a JSON into an HTML table
|
||||
html = convert({"data": jsn}, build_direction=build_direction, table_attributes=table_attributes)
|
||||
|
||||
# Cleanup the generated HTML table notification
|
||||
html = format_table(html, "data", headerProps, tableTitle).replace('<ul>','<ul style="list-style:none;padding-left:0">').replace("<td>null</td>", "<td></td>")
|
||||
|
||||
# prepare text-only message
|
||||
for device in jsn:
|
||||
for header in headers:
|
||||
padding = ""
|
||||
if len(header) < 4:
|
||||
padding = "\t"
|
||||
text += text_line.format ( header + ': ' + padding, device[header])
|
||||
text += '\n'
|
||||
|
||||
# Format HTML table headers
|
||||
for header in headers:
|
||||
html = format_table(html, header, thProps)
|
||||
|
||||
return html, text
|
||||
|
||||
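# Hedged example of the JSON shape construct_notifications() expects, matching
# the final_json assembled in reporting.py: each section comes with a parallel
# "<section>_meta" entry carrying the table title and column names.
# The device values below are illustrative only.
sample = {
    "new_devices": [
        {"MAC": "aa:bb:cc:dd:ee:ff", "Datetime": "2024-01-01 10:00",
         "IP": "192.168.1.20", "Event Type": "New Device",
         "Device name": "(unknown)", "Comments": ""}
    ],
    "new_devices_meta": {
        "title": "New devices",
        "columnNames": ["MAC", "Datetime", "IP", "Event Type", "Device name", "Comments"]
    }
}
html, text = construct_notifications(sample, "new_devices")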
#-------------------------------------------------------------------------------
|
||||
def send_api(json_final, mail_text, mail_html):
|
||||
mylog('verbose', ['[Send API] Updating notification_* files in ', apiPath])
|
||||
|
||||
write_file(apiPath + 'notification_text.txt' , mail_text)
|
||||
write_file(apiPath + 'notification_text.html' , mail_html)
|
||||
write_file(apiPath + 'notification_json_final.json' , json.dumps(json_final))
|
||||
|
||||
|
||||
|
||||
#-------------------------------------------------------------------------------
|
||||
# Replacing table headers
|
||||
def format_table (html, thValue, props, newThValue = ''):
|
||||
|
||||
if newThValue == '':
|
||||
newThValue = thValue
|
||||
|
||||
return html.replace("<th>"+thValue+"</th>", "<th "+props+" >"+newThValue+"</th>" )
|
||||
|
||||
|
||||
|
||||
|
||||
@@ -241,7 +241,7 @@ def execute_plugin(db, plugin, pluginsState = plugins_state() ):
|
||||
if len(columns) == 9:
|
||||
# Create a tuple containing values to be inserted into the database.
|
||||
# Each value corresponds to a column in the table in the order of the columns.
|
||||
# must match the Plugins_Objects and Plugins_Events databse tables and can be used as input for the plugin_object_class.
|
||||
# must match the Plugins_Objects and Plugins_Events database tables and can be used as input for the plugin_object_class.
|
||||
sqlParams.append(
|
||||
(
|
||||
0, # "Index" placeholder
|
||||
@@ -281,7 +281,7 @@ def execute_plugin(db, plugin, pluginsState = plugins_state() ):
|
||||
if len(row) == 9 and (row[0] in ['','null']) == False :
|
||||
# Create a tuple containing values to be inserted into the database.
|
||||
# Each value corresponds to a column in the table in the order of the columns.
|
||||
# must match the Plugins_Objects and Plugins_Events databse tables and can be used as input for the plugin_object_class
|
||||
# must match the Plugins_Objects and Plugins_Events database tables and can be used as input for the plugin_object_class
|
||||
sqlParams.append(
|
||||
(
|
||||
0, # "Index" placeholder
|
||||
@@ -327,6 +327,8 @@ def execute_plugin(db, plugin, pluginsState = plugins_state() ):
|
||||
try:
|
||||
sql.execute ("ATTACH DATABASE '"+ fullSqlitePath +"' AS EXTERNAL_"+plugin["unique_prefix"])
|
||||
arr = db.get_sql_array (q)
|
||||
sql.execute ("DETACH DATABASE EXTERNAL_"+plugin["unique_prefix"])
|
||||
|
||||
except sqlite3.Error as e:
|
||||
mylog('none',[f'[Plugins] ⚠ ERROR: DB_PATH setting ({fullSqlitePath}) for plugin {plugin["unique_prefix"]}. Did you mount it correctly?'])
|
||||
mylog('none',[f'[Plugins] ⚠ ERROR: ATTACH DATABASE failed with SQL ERROR: ', e])
|
||||
@@ -337,7 +339,7 @@ def execute_plugin(db, plugin, pluginsState = plugins_state() ):
|
||||
if len(row) == 9 and (row[0] in ['','null']) == False :
|
||||
# Create a tuple containing values to be inserted into the database.
|
||||
# Each value corresponds to a column in the table in the order of the columns.
|
||||
# must match the Plugins_Objects and Plugins_Events databse tables and can be used as input for the plugin_object_class
|
||||
# must match the Plugins_Objects and Plugins_Events database tables and can be used as input for the plugin_object_class
|
||||
sqlParams.append((
|
||||
0, # "Index" placeholder
|
||||
plugin["unique_prefix"], # "Plugin"
|
||||
@@ -750,6 +752,9 @@ def check_and_run_user_event(db, pluginsState):
|
||||
pluginsState = handle_test(param, db, pluginsState)
|
||||
if event == 'run':
|
||||
pluginsState = handle_run(param, db, pluginsState)
|
||||
if event == 'update_api':
|
||||
# update API endpoints
|
||||
update_api(db, False, param.split(','))
|
||||
|
||||
# Clear the log file
|
||||
open(logFile, "w").close()
|
||||
@@ -784,7 +789,7 @@ def handle_test(runType, db, pluginsState):
|
||||
|
||||
# Create fake notification
|
||||
notification = Notification_obj(db)
|
||||
notificationObj = notification.create(sample_json, sample_txt, sample_html, "")
|
||||
notificationObj = notification.create(sample_json, "")
|
||||
|
||||
# Run test
|
||||
pluginsState = handle_run(runType, db, pluginsState)
|
||||
|
||||
@@ -12,103 +12,37 @@
|
||||
|
||||
import datetime
|
||||
import json
|
||||
import socket
|
||||
import subprocess
|
||||
import requests
|
||||
from json2table import convert
|
||||
|
||||
# pialert modules
|
||||
import conf
|
||||
import const
|
||||
from const import pialertPath, logPath, apiPath
|
||||
from helper import noti_obj, generate_mac_links, removeDuplicateNewLines, timeNowTZ, hide_email, updateState, get_file_content, write_file
|
||||
from helper import timeNowTZ, get_file_content, write_file, get_timezone_offset, get_setting_value
|
||||
from logger import logResult, mylog, print_log
|
||||
|
||||
|
||||
#===============================================================================
|
||||
# REPORTING
|
||||
#===============================================================================
|
||||
# create a json of the notifications to provide further integration options
|
||||
json_final = []
|
||||
|
||||
|
||||
#-------------------------------------------------------------------------------
|
||||
def construct_notifications(db, sqlQuery, tableTitle, skipText = False, suppliedJsonStruct = None):
|
||||
|
||||
if suppliedJsonStruct is None and sqlQuery == "":
|
||||
return noti_obj("", "", "")
|
||||
|
||||
table_attributes = {"style" : "border-collapse: collapse; font-size: 12px; color:#70707", "width" : "100%", "cellspacing" : 0, "cellpadding" : "3px", "bordercolor" : "#C0C0C0", "border":"1"}
|
||||
headerProps = "width='120px' style='color:white; font-size: 16px;' bgcolor='#64a0d6' "
|
||||
thProps = "width='120px' style='color:#F0F0F0' bgcolor='#64a0d6' "
|
||||
|
||||
build_direction = "TOP_TO_BOTTOM"
|
||||
text_line = '{}\t{}\n'
|
||||
|
||||
if suppliedJsonStruct is None:
|
||||
json_obj = db.get_table_as_json(sqlQuery)
|
||||
else:
|
||||
json_obj = suppliedJsonStruct
|
||||
|
||||
jsn = json_obj.json
|
||||
html = ""
|
||||
text = ""
|
||||
|
||||
if len(jsn["data"]) > 0:
|
||||
text = tableTitle + "\n---------\n"
|
||||
|
||||
# Convert a JSON into an HTML table
|
||||
html = convert(jsn, build_direction=build_direction, table_attributes=table_attributes)
|
||||
|
||||
# Cleanup the generated HTML table notification
|
||||
html = format_table(html, "data", headerProps, tableTitle).replace('<ul>','<ul style="list-style:none;padding-left:0">').replace("<td>null</td>", "<td></td>")
|
||||
|
||||
headers = json_obj.columnNames
|
||||
|
||||
# prepare text-only message
|
||||
if skipText == False:
|
||||
|
||||
for device in jsn["data"]:
|
||||
for header in headers:
|
||||
padding = ""
|
||||
if len(header) < 4:
|
||||
padding = "\t"
|
||||
text += text_line.format ( header + ': ' + padding, device[header])
|
||||
text += '\n'
|
||||
|
||||
# Format HTML table headers
|
||||
for header in headers:
|
||||
html = format_table(html, header, thProps)
|
||||
|
||||
notiStruc = noti_obj(jsn, text, html)
|
||||
|
||||
|
||||
if not notiStruc.json['data'] and not notiStruc.text and not notiStruc.html:
|
||||
mylog('debug', '[Notification] notiStruc is empty')
|
||||
else:
|
||||
mylog('debug', ['[Notification] notiStruc:', json.dumps(notiStruc.__dict__, indent=4)])
|
||||
|
||||
return notiStruc
|
||||
|
||||
|
||||
def get_notifications (db):
|
||||
|
||||
sql = db.sql #TO-DO
|
||||
global mail_text, mail_html, json_final, partial_html, partial_txt, partial_json
|
||||
|
||||
deviceUrl = conf.REPORT_DASHBOARD_URL + '/deviceDetails.php?mac='
|
||||
plugins_report = False
|
||||
|
||||
|
||||
# Reporting section
|
||||
mylog('verbose', ['[Notification] Check if something to report'])
|
||||
|
||||
# prepare variables for JSON construction
|
||||
json_internet = []
|
||||
# prepare variables for JSON construction
|
||||
json_new_devices = []
|
||||
json_new_devices_meta = {}
|
||||
json_down_devices = []
|
||||
json_events = []
|
||||
json_ports = []
|
||||
json_down_devices_meta = {}
|
||||
json_events = []
|
||||
json_events_meta = {}
|
||||
json_plugins = []
|
||||
json_plugins_meta = {}
|
||||
|
||||
# Disable reporting on events for devices where reporting is disabled based on the MAC address
|
||||
sql.execute ("""UPDATE Events SET eve_PendingAlertEmail = 0
|
||||
@@ -121,192 +55,106 @@ def get_notifications (db):
|
||||
(
|
||||
SELECT dev_MAC FROM Devices WHERE dev_AlertDeviceDown = 0
|
||||
)""")
|
||||
|
||||
sections = get_setting_value('NTFPRCS_INCLUDED_SECTIONS')
|
||||
|
||||
# Open text Template
|
||||
mylog('verbose', ['[Notification] Open text Template'])
|
||||
template_file = open(pialertPath + '/back/report_template.txt', 'r')
|
||||
mail_text = template_file.read()
|
||||
template_file.close()
|
||||
mylog('verbose', ['[Notification] Included sections: ', sections ])
|
||||
|
||||
# Open html Template
|
||||
mylog('verbose', ['[Notification] Open html Template'])
|
||||
|
||||
|
||||
# select template type depoending if running latest version or an older one
|
||||
if conf.newVersionAvailable :
|
||||
template_file_path = '/back/report_template_new_version.html'
|
||||
else:
|
||||
template_file_path = '/back/report_template.html'
|
||||
|
||||
mylog('verbose', ['[Notification] Using template', template_file_path])
|
||||
template_file = open(pialertPath + template_file_path, 'r')
|
||||
|
||||
mail_html = template_file.read()
|
||||
template_file.close()
|
||||
|
||||
# Report "REPORT_DATE" in Header & footer
|
||||
timeFormated = timeNowTZ().strftime ('%Y-%m-%d %H:%M')
|
||||
mail_text = mail_text.replace ('<REPORT_DATE>', timeFormated)
|
||||
mail_html = mail_html.replace ('<REPORT_DATE>', timeFormated)
|
||||
|
||||
# Report "SERVER_NAME" in Header & footer
|
||||
mail_text = mail_text.replace ('<SERVER_NAME>', socket.gethostname() )
|
||||
mail_html = mail_html.replace ('<SERVER_NAME>', socket.gethostname() )
|
||||
|
||||
# Report "VERSION" in Header & footer
|
||||
VERSIONFILE = subprocess.check_output(['php', pialertPath + '/front/php/templates/version.php']).decode('utf-8')
|
||||
mail_text = mail_text.replace ('<VERSION_PIALERT>', VERSIONFILE)
|
||||
mail_html = mail_html.replace ('<VERSION_PIALERT>', VERSIONFILE)
|
||||
|
||||
# Report "BUILD" in Header & footer
|
||||
BUILDFILE = subprocess.check_output(['php', pialertPath + '/front/php/templates/build.php']).decode('utf-8')
|
||||
mail_text = mail_text.replace ('<BUILD_PIALERT>', BUILDFILE)
|
||||
mail_html = mail_html.replace ('<BUILD_PIALERT>', BUILDFILE)
|
||||
|
||||
mylog('verbose', ['[Notification] included sections: ', conf.INCLUDED_SECTIONS ])
|
||||
|
||||
if 'new_devices' in conf.INCLUDED_SECTIONS :
|
||||
if 'new_devices' in sections:
|
||||
# Compose New Devices Section
|
||||
sqlQuery = """SELECT eve_MAC as MAC, eve_DateTime as Datetime, dev_LastIP as IP, eve_EventType as "Event Type", dev_Name as "Device name", dev_Comments as Comments FROM Events_Devices
|
||||
sqlQuery = f"""SELECT eve_MAC as MAC, eve_DateTime as Datetime, dev_LastIP as IP, eve_EventType as "Event Type", dev_Name as "Device name", dev_Comments as Comments FROM Events_Devices
|
||||
WHERE eve_PendingAlertEmail = 1
|
||||
AND eve_EventType = 'New Device'
|
||||
ORDER BY eve_DateTime"""
|
||||
AND eve_EventType = 'New Device'
|
||||
{get_setting_value('NTFPRCS_new_dev_condition').replace('{s-quote}',"'")}
|
||||
ORDER BY eve_DateTime"""
|
||||
|
||||
notiStruc = construct_notifications(db, sqlQuery, "New devices")
|
||||
mylog('debug', ['[Notification] new_devices SQL query: ', sqlQuery ])
|
||||
|
||||
# collect "new_devices" for the json
|
||||
json_new_devices = notiStruc.json["data"]
|
||||
# Get the events as JSON
|
||||
json_obj = db.get_table_as_json(sqlQuery)
|
||||
|
||||
mail_text = mail_text.replace ('<NEW_DEVICES_TABLE>', notiStruc.text + '\n')
|
||||
mail_html = mail_html.replace ('<NEW_DEVICES_TABLE>', notiStruc.html)
|
||||
mylog('verbose', ['[Notification] New Devices sections done.'])
|
||||
json_new_devices_meta = {
|
||||
"title": "New devices",
|
||||
"columnNames": json_obj.columnNames
|
||||
}
|
||||
|
||||
if 'down_devices' in conf.INCLUDED_SECTIONS :
|
||||
# Compose Devices Down Section
|
||||
sqlQuery = """SELECT eve_MAC as MAC, eve_DateTime as Datetime, dev_LastIP as IP, eve_EventType as "Event Type", dev_Name as "Device name", dev_Comments as Comments FROM Events_Devices
|
||||
WHERE eve_PendingAlertEmail = 1
|
||||
AND eve_EventType = 'Device Down'
|
||||
ORDER BY eve_DateTime"""
|
||||
json_new_devices = json_obj.json["data"]
|
||||
|
||||
notiStruc = construct_notifications(db, sqlQuery, "Down devices")
|
||||
if 'down_devices' in sections:
|
||||
# Compose Devices Down Section
|
||||
# - select only Down Alerts with pending email of devices that didn't reconnect within the specified time window
|
||||
sqlQuery = f"""
|
||||
SELECT dev_Name, eve_MAC, dev_Vendor, eve_IP, eve_DateTime, eve_EventType
|
||||
FROM Events_Devices AS down_events
|
||||
WHERE eve_PendingAlertEmail = 1
|
||||
AND down_events.eve_EventType = 'Device Down'
|
||||
AND eve_DateTime < datetime('now', '-{get_setting_value('NTFPRCS_alert_down_time')} minutes', '{get_timezone_offset()}')
|
||||
AND NOT EXISTS (
|
||||
SELECT 1
|
||||
FROM Events AS connected_events
|
||||
WHERE connected_events.eve_MAC = down_events.eve_MAC
|
||||
AND connected_events.eve_EventType = 'Connected'
|
||||
AND connected_events.eve_DateTime > down_events.eve_DateTime
|
||||
)
|
||||
ORDER BY down_events.eve_DateTime;
|
||||
"""
|
||||
|
||||
# Get the events as JSON
|
||||
json_obj = db.get_table_as_json(sqlQuery)
|
||||
|
||||
# collect "down_devices" for the json
|
||||
json_down_devices = notiStruc.json["data"]
|
||||
json_down_devices_meta = {
|
||||
"title": "Down devices",
|
||||
"columnNames": json_obj.columnNames
|
||||
}
|
||||
json_down_devices = json_obj.json["data"]
|
||||
|
||||
mail_text = mail_text.replace ('<DOWN_DEVICES_TABLE>', notiStruc.text + '\n')
|
||||
mail_html = mail_html.replace ('<DOWN_DEVICES_TABLE>', notiStruc.html)
|
||||
mylog('verbose', ['[Notification] Down Devices sections done.'])
|
||||
|
||||
if 'events' in conf.INCLUDED_SECTIONS :
|
||||
if 'events' in sections:
|
||||
# Compose Events Section
|
||||
sqlQuery = """SELECT eve_MAC as MAC, eve_DateTime as Datetime, dev_LastIP as IP, eve_EventType as "Event Type", dev_Name as "Device name", dev_Comments as Comments FROM Events_Devices
|
||||
sqlQuery = f"""SELECT eve_MAC as MAC, eve_DateTime as Datetime, dev_LastIP as IP, eve_EventType as "Event Type", dev_Name as "Device name", dev_Comments as Comments FROM Events_Devices
|
||||
WHERE eve_PendingAlertEmail = 1
|
||||
AND eve_EventType IN ('Connected','Disconnected',
|
||||
'IP Changed')
|
||||
ORDER BY eve_DateTime"""
|
||||
'IP Changed')
|
||||
{get_setting_value('NTFPRCS_event_condition').replace('{s-quote}',"'")}
|
||||
ORDER BY eve_DateTime"""
|
||||
|
||||
notiStruc = construct_notifications(db, sqlQuery, "Events")
|
||||
mylog('debug', ['[Notification] events SQL query: ', sqlQuery ])
|
||||
|
||||
# Get the events as JSON
|
||||
json_obj = db.get_table_as_json(sqlQuery)
|
||||
|
||||
# collect "events" for the json
|
||||
json_events = notiStruc.json["data"]
|
||||
json_events_meta = {
|
||||
"title": "Events",
|
||||
"columnNames": json_obj.columnNames
|
||||
}
|
||||
json_events = json_obj.json["data"]
|
||||
|
||||
mail_text = mail_text.replace ('<EVENTS_TABLE>', notiStruc.text + '\n')
|
||||
mail_html = mail_html.replace ('<EVENTS_TABLE>', notiStruc.html)
|
||||
mylog('verbose', ['[Notification] Events sections done.'])
|
||||
|
||||
if 'plugins' in conf.INCLUDED_SECTIONS:
if 'plugins' in sections:
    # Compose Plugins Section
    sqlQuery = """SELECT Plugin, Object_PrimaryId, Object_SecondaryId, DateTimeChanged, Watched_Value1, Watched_Value2, Watched_Value3, Watched_Value4, Status from Plugins_Events"""

    # Get the events as JSON
    json_obj = db.get_table_as_json(sqlQuery)

    notiStruc = construct_notifications(db, sqlQuery, "Plugins")
    json_plugins_meta = {
        "title": "Plugins",
        "columnNames": json_obj.columnNames
    }
    json_plugins = json_obj.json["data"]

    # collect "plugins" for the json
    json_plugins = notiStruc.json["data"]

    mail_text = mail_text.replace ('<PLUGINS_TABLE>', notiStruc.text + '\n')
    mail_html = mail_html.replace ('<PLUGINS_TABLE>', notiStruc.html)

    # check if we need to report something
    plugins_report = len(json_plugins) > 0
    mylog('verbose', ['[Notification] Plugins sections done.'])

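# A minimal sketch of the placeholder replacement used for every section above: the
# e-mail templates contain literal markers such as <PLUGINS_TABLE> that are swapped
# for the rendered table. The template string below is illustrative, not the real one.
_mail_text_template = "Plugins:\n<PLUGINS_TABLE>\n"
_rendered_text = _mail_text_template.replace('<PLUGINS_TABLE>', 'Plugin | Object | Status' + '\n')
# _rendered_text == "Plugins:\nPlugin | Object | Status\n\n"
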
final_json = {
    "internet": json_internet,
final_json = {
    "new_devices": json_new_devices,
    "new_devices_meta": json_new_devices_meta,
    "down_devices": json_down_devices,
    "down_devices_meta": json_down_devices_meta,
    "events": json_events,
    "events_meta": json_events_meta,
    "plugins": json_plugins,
}
    "plugins_meta": json_plugins_meta,
}

final_text = removeDuplicateNewLines(mail_text)
return final_json

# Create clickable MAC links
final_html = generate_mac_links (mail_html, deviceUrl)

# Write output emails for debug
write_file (logPath + '/report_output.json', json.dumps(final_json))
write_file (logPath + '/report_output.txt', final_text)
write_file (logPath + '/report_output.html', final_html)

mylog('minimal', ['[Notification] Updating API files'])
send_api()

return noti_obj(final_json, final_text, final_html)

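# A sketch of the shape that construct_notifications / noti_obj results appear to
# have, inferred only from the .json / .text / .html attributes used above; this is
# an assumption for illustration, not the actual NetAlertX class.
from dataclasses import dataclass, field

@dataclass
class NotificationPayloadSketch:
    json: dict = field(default_factory=dict)   # e.g. {"data": [...]} plus metadata entries
    text: str = ""                             # plain-text table for the <..._TABLE> placeholders
    html: str = ""                             # HTML table for the e-mail body
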
#-------------------------------------------------------------------------------
# Replacing table headers
def format_table (html, thValue, props, newThValue = ''):

    if newThValue == '':
        newThValue = thValue

    return html.replace("<th>"+thValue+"</th>", "<th "+props+" >"+newThValue+"</th>" )

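# Usage sketch for format_table above (the HTML snippet and props are illustrative):
_sample_html = "<table><tr><th>Name</th><th>IP</th></tr></table>"
_aligned = format_table(_sample_html, "Name", 'style="text-align:left;"')
# _aligned == '<table><tr><th style="text-align:left;" >Name</th><th>IP</th></tr></table>'
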
#-------------------------------------------------------------------------------
def format_report_section (pActive, pSection, pTable, pText, pHTML):

    # Replace section text
    if pActive :
        conf.mail_text = conf.mail_text.replace ('<'+ pTable +'>', pText)
        conf.mail_html = conf.mail_html.replace ('<'+ pTable +'>', pHTML)

        conf.mail_text = remove_tag (conf.mail_text, pSection)
        conf.mail_html = remove_tag (conf.mail_html, pSection)
    else:
        conf.mail_text = remove_section (conf.mail_text, pSection)
        conf.mail_html = remove_section (conf.mail_html, pSection)

#-------------------------------------------------------------------------------
def remove_section (pText, pSection):
    # Search for the section in the text
    if pText.find ('<'+ pSection +'>') >=0 \
    and pText.find ('</'+ pSection +'>') >=0 :
        # return text without the section
        return pText[:pText.find ('<'+ pSection+'>')] + \
               pText[pText.find ('</'+ pSection +'>') + len (pSection) +3:]
    else :
        # return all text
        return pText

#-------------------------------------------------------------------------------
def remove_tag (pText, pTag):
    # return text without the tag
    return pText.replace ('<'+ pTag +'>','').replace ('</'+ pTag +'>','')

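# Usage sketch for remove_section and remove_tag above (the template string and the
# SOME_SECTION tag name are illustrative, not actual NetAlertX template tags):
_template = "Header <SOME_SECTION>section body</SOME_SECTION> Footer"
_without_section = remove_section(_template, "SOME_SECTION")   # "Header  Footer"
_without_tags    = remove_tag(_template, "SOME_SECTION")       # "Header section body Footer"
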
#-------------------------------------------------------------------------------
# Reporting
#-------------------------------------------------------------------------------

#-------------------------------------------------------------------------------
def send_api():
    mylog('verbose', ['[Send API] Updating notification_* files in ', apiPath])

    write_file(apiPath + 'notification_text.txt' , mail_text)
    write_file(apiPath + 'notification_text.html' , mail_html)
    write_file(apiPath + 'notification_json_final.json' , json.dumps(json_final))


#-------------------------------------------------------------------------------
@@ -314,7 +162,8 @@ def skip_repeated_notifications (db):

    # Skip repeated notifications
    # due to strftime overflow --> use "strftime / 60"
    mylog('verbose','[Skip Repeated Notifications] Skip Repeated start')
    mylog('verbose','[Skip Repeated Notifications] Skip Repeated')

    db.sql.execute ("""UPDATE Events SET eve_PendingAlertEmail = 0
                        WHERE eve_PendingAlertEmail = 1 AND eve_MAC IN
                            (
@@ -326,7 +175,7 @@ def skip_repeated_notifications (db):
                                (strftime('%s','now','localtime')/60 )
                            )
                        """ )
    mylog('verbose','[Skip Repeated Notifications] Skip Repeated end')


    db.commitDB()
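# A standalone sketch (toy query, not the full UPDATE above) of the
# strftime('%s', ...)/60 pattern: dividing by 60 compares whole minutes since the
# epoch, which is what the overflow comment above refers to.
import sqlite3

_minutes_now = sqlite3.connect(':memory:').execute(
    "SELECT strftime('%s','now','localtime')/60"
).fetchone()[0]
# _minutes_now is an integer count of minutes (seconds since the epoch, shifted to
# local time by SQLite, then integer-divided by 60)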