diff --git a/.github/workflows/codeql-analysis.yml b/.github/workflows/codeql-analysis.yml
index 8548df8a..4a17664c 100644
--- a/.github/workflows/codeql-analysis.yml
+++ b/.github/workflows/codeql-analysis.yml
@@ -29,7 +29,7 @@ jobs:
 
     steps:
     - name: Checkout repository
-      uses: actions/checkout@v3
+      uses: actions/checkout@v4
 
     # Initializes the CodeQL tools for scanning.
     - name: Initialize CodeQL
diff --git a/.github/workflows/pkgbuild.yml b/.github/workflows/pkgbuild.yml
index 2ff5f060..b26966e2 100644
--- a/.github/workflows/pkgbuild.yml
+++ b/.github/workflows/pkgbuild.yml
@@ -49,7 +49,7 @@ jobs:
     steps:
 
     - name: Acquire sources
-      uses: actions/checkout@v3
+      uses: actions/checkout@v4
 
     - name: Build package
       run: |
diff --git a/.github/workflows/tests.yml b/.github/workflows/tests.yml
index be65f743..ef302ac6 100644
--- a/.github/workflows/tests.yml
+++ b/.github/workflows/tests.yml
@@ -75,7 +75,7 @@ jobs:
     steps:
 
     - name: Acquire sources
-      uses: actions/checkout@v3
+      uses: actions/checkout@v4
 
     - name: Install prerequisites (Linux)
       if: runner.os == 'Linux'
@@ -137,7 +137,7 @@ jobs:
         coverage report
 
     - name: Upload coverage data
-      uses: codecov/codecov-action@v3
+      uses: codecov/codecov-action@v4
       with:
         files: ./coverage.xml
         fail_ci_if_error: false
diff --git a/README.md b/README.md
index 429331b8..6ffebaec 100644
--- a/README.md
+++ b/README.md
@@ -534,9 +534,121 @@ aobj.add('foobar://')
 # Send our notification out through our foobar://
 aobj.notify("test")
 ```
-
 You can read more about creating your own custom notifications and/or hooks [here](https://github.com/caronc/apprise/wiki/decorator_notify).
 
+# Persistent Storage
+
+Persistent storage allows Apprise to optionally cache recurring actions to disk. This can greatly reduce the overhead required to send a notification.
+
+Apprise can operate using one of three persistent storage modes (a CLI sketch follows this list):
+1. `AUTO`:  Flush any gathered data for persistent storage to disk on demand.  This option is incredibly lightweight.  This is the default behavior for all CLI usage; the CLI only writes accumulated content to disk once all of its notifications have completed.  Developers using this mode can also manually flush content to disk at any time.
+1. `FLUSH`: Flushes any gathered data for persistent storage as often as it is acquired.
+1. `MEMORY`: Only store information in memory, never write to disk.  Set this mode if you simply wish to disable Persistent Storage entirely.  By default this is the mode used by the API; it is at the developer's discretion to enable one of the other options.
+
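+For example, the mode can be overridden per CLI call; below is a sketch using the `--storage-mode` (`-SM`) flag this change introduces (the notification URL is a placeholder):
+```bash
+# Force every persistent change to be written to disk as soon as possible:
+apprise --storage-mode flush -b "test" mailto://user:pass@example.com
+
+# Disable persistent storage entirely for this single call:
+apprise --storage-mode memory -b "test" mailto://user:pass@example.com
+```
+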
+## CLI Persistent Storage Commands
+Persistent storage is set to `AUTO` mode by default.
+
+Specifying the keyword `storage` indicates that all subsequent arguments relate to the storage subsection of Apprise.
+```bash
+# List all of the occupied space used by Apprise's Persistent Storage:
+apprise storage list
+
+# list is the default option, so the following does the same thing:
+apprise storage
+
+# You can prune all of your storage older than 30 days
+# and not accessed for this period like so:
+apprise storage prune
+
+# You can do a hard reset (and wipe all persistent storage) with:
+apprise storage clean
+
+```
+
+You can also filter your results by adding tags and/or URL Identifiers.  When you get a listing (`apprise storage list`), you may see:
+```
+   # example output of 'apprise storage list':
+   1. f7077a65                                             0.00B    unused
+      - matrixs://abcdef:****@synapse.example12.com/%23general?image=no&mode=off&version=3&msgtype...
+      tags: team
+
+   2. 0e873a46                                            81.10B    active
+      - tgram://W...U//?image=False&detect=yes&silent=no&preview=no&content=before&mdv=v1&format=m...
+      tags: personal
+
+   3. abcd123                                             12.00B    stale
+
+```
+The states are:
+ - `unused`: This plugin has not committed anything to disk for reuse/cache purposes
+ - `active`: This plugin has written content to disk, or at the very least has prepared a persistent storage location it can write into.
+ - `stale`: The system detected a location that a URL may have written to in the past, but nothing in the URLs provided links to it.  It is likely wasting space or is no longer of any use.
+
+You can use this information to filter your results by specifying _URL ID_ values after your command.  For example:
+```bash
+# The below commands continue with the example already identified above
+# the following would match abcd123 (even though just ab was provided)
+# The output would only list the 'stale' entry above
+apprise storage list ab
+
+# knowing our filter is safe, we could remove it
+# the below command would not affect our other two URLs and would only
+# remove our stale one:
+apprise storage clean ab
+
+# Entries can be filtered by tag as well:
+apprise storage list --tag=team
+
+# You can match on multiple URL IDs as well:
+# The following would match the URL IDs of 1. and 2. above
+apprise storage list f 0
+```
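+
+The prune age can also be adjusted; below is a sketch using the `--storage-prune-days` (`-SPD`) flag this change introduces:
+```bash
+# Prune anything not accessed within the last 7 days instead of
+# the default 30:
+apprise storage prune --storage-prune-days 7
+```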
+
+For more information on persistent storage, [visit here](https://github.com/caronc/apprise/wiki/persistent_storage).
+
+
+## API Persistent Storage Commands
+By default, persistent storage is set to `MEMORY` mode for those building from within the Apprise API.
+It's at the developer's discretion to enable it. Should you choose to do so, it's as easy as including the information in the `AppriseAsset()` object prior to the initialization of your `Apprise()` instance.
+
+For example:
+```python
+from apprise import Apprise
+from apprise import AppriseAsset
+from apprise import PersistentStoreMode
+
+# Prepare a location the persistent storage can write to
+# Providing a path implicitly assumes you wish to write in AUTO mode
+asset = AppriseAsset(storage_path="/path/to/save/data")
+
+# If you want to be more explicit and set more options, then
+# you may do the following
+asset = AppriseAsset(
+    # Set our storage path directory (minimum requirement to enable it)
+    storage_path="/path/to/save/data",
+
+    # Set the mode... the options are:
+    # 1. PersistentStoreMode.MEMORY
+    #       - disable persistent storage from writing to disk
+    # 2. PersistentStoreMode.AUTO
+    #       - write to disk on demand
+    # 3. PersistentStoreMode.FLUSH
+    #       - write to disk always and often
+    storage_mode=PersistentStoreMode.FLUSH,
+
+    # URL IDs are 8 characters in length by default, and there is
+    # really no reason to change this.  You can increase/decrease
+    # its value here.  Must be at least 2; defaults to 8 if not specified
+    storage_idlen=6,
+)
+
+# Now that we've got our asset, we just work with our Apprise object as we
+# normally do
+aobj = Apprise(asset=asset)
+```
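+
+Developers can also interact with the persistent store directly.  The below is a minimal sketch only; it assumes `set()` and `get()` accept a simple key/value pair (their full signatures are documented in the wiki link below):
+```python
+from apprise import PersistentStore
+
+# Point the store at a writable path; the namespace would normally be
+# the URL ID a plugin generates (a hypothetical value is used here)
+store = PersistentStore(path="/path/to/save/data", namespace="abc12345")
+
+# Cache a value, then retrieve it again on a later run
+store.set("token", "abcd1234")
+token = store.get("token")
+```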
+
+For more information on persistent storage, [visit here](https://github.com/caronc/apprise/wiki/persistent_storage).
+
 # Want To Learn More?
 
 If you're interested in reading more about this and other methods on how to customize your own notifications, please check out the following links:
@@ -545,6 +657,7 @@ If you're interested in reading more about this and other methods on how to cust
 * πŸ”§ [Troubleshooting](https://github.com/caronc/apprise/wiki/Troubleshooting)
 * βš™οΈ [Configuration File Help](https://github.com/caronc/apprise/wiki/config)
 * ⚑ [Create Your Own Custom Notifications](https://github.com/caronc/apprise/wiki/decorator_notify)
+* πŸ’Ύ [Persistent Storage](https://github.com/caronc/apprise/wiki/persistent_storage)
 * 🌎 [Apprise API/Web Interface](https://github.com/caronc/apprise-api)
 * πŸŽ‰ [Showcase](https://github.com/caronc/apprise/wiki/showcase)
 
diff --git a/apprise/__init__.py b/apprise/__init__.py
index 7bc19efe..112e3ff9 100644
--- a/apprise/__init__.py
+++ b/apprise/__init__.py
@@ -48,16 +48,20 @@ from .common import ContentIncludeMode
 from .common import CONTENT_INCLUDE_MODES
 from .common import ContentLocation
 from .common import CONTENT_LOCATIONS
+from .common import PersistentStoreMode
+from .common import PERSISTENT_STORE_MODES
 
 from .url import URLBase
 from .url import PrivacyMode
 from .plugins.base import NotifyBase
 from .config.base import ConfigBase
 from .attachment.base import AttachBase
+from . import exception
 
 from .apprise import Apprise
 from .locale import AppriseLocale
 from .asset import AppriseAsset
+from .persistent_store import PersistentStore
 from .apprise_config import AppriseConfig
 from .apprise_attachment import AppriseAttachment
 from .manager_attachment import AttachmentManager
@@ -77,6 +81,10 @@ __all__ = [
     # Core
     'Apprise', 'AppriseAsset', 'AppriseConfig', 'AppriseAttachment', 'URLBase',
     'NotifyBase', 'ConfigBase', 'AttachBase', 'AppriseLocale',
+    'PersistentStore',
+
+    # Exceptions
+    'exception',
 
     # Reference
     'NotifyType', 'NotifyImageSize', 'NotifyFormat', 'OverflowMode',
@@ -84,6 +92,7 @@ __all__ = [
     'ConfigFormat', 'CONFIG_FORMATS',
     'ContentIncludeMode', 'CONTENT_INCLUDE_MODES',
     'ContentLocation', 'CONTENT_LOCATIONS',
+    'PersistentStoreMode', 'PERSISTENT_STORE_MODES',
     'PrivacyMode',
 
     # Managers
diff --git a/apprise/asset.py b/apprise/asset.py
index c0fab9c0..2f08a666 100644
--- a/apprise/asset.py
+++ b/apprise/asset.py
@@ -33,6 +33,7 @@ from os.path import dirname
 from os.path import isfile
 from os.path import abspath
 from .common import NotifyType
+from .common import PersistentStoreMode
 from .manager_plugins import NotificationManager
 
 
@@ -157,6 +158,22 @@ class AppriseAsset:
     # By default, no paths are scanned.
     __plugin_paths = []
 
+    # Optionally set the location of the persistent storage
+    # By default there is no path and thus persistent storage is not used
+    __storage_path = None
+
+    # Optionally define the default salt to apply to all persistent storage
+    # namespace generation (unless over-ridden)
+    __storage_salt = b''
+
+    # Optionally define the namespace length of the directories created by
+    # the storage. If this is set to zero, then the length is pre-determined
+    # by the generator (sha1, md5, sha256, etc)
+    __storage_idlen = 8
+
+    # Set storage to auto
+    __storage_mode = PersistentStoreMode.AUTO
+
     # All internal/system flags are prefixed with an underscore (_)
     # These can only be initialized using Python libraries and are not picked
     # up from (yaml) configuration files (if set)
@@ -171,7 +188,9 @@ class AppriseAsset:
     # A unique identifer we can use to associate our calling source
     _uid = str(uuid4())
 
-    def __init__(self, plugin_paths=None, **kwargs):
+    def __init__(self, plugin_paths=None, storage_path=None,
+                 storage_mode=None, storage_salt=None,
+                 storage_idlen=None, **kwargs):
         """
         Asset Initialization
 
@@ -187,8 +206,49 @@ class AppriseAsset:
 
         if plugin_paths:
             # Load any decorated modules if defined
+            self.__plugin_paths = plugin_paths
             N_MGR.module_detection(plugin_paths)
 
+        if storage_path:
+            # Define our persistent storage path
+            self.__storage_path = storage_path
+
+        if storage_mode:
+            # Define how our persistent storage behaves
+            self.__storage_mode = storage_mode
+
+        if isinstance(storage_idlen, int):
+            # Define the number of characters used for our namespace length
+            if storage_idlen < 0:
+                # Unsupported value
+                raise ValueError(
+                    'AppriseAsset storage_idlen(): Value must '
+                    'be an integer and >= 0')
+
+            # Store value
+            self.__storage_idlen = storage_idlen
+
+        if storage_salt is not None:
+            # Define the salt to apply to our namespace generation
+
+            if isinstance(storage_salt, bytes):
+                self.__storage_salt = storage_salt
+
+            elif isinstance(storage_salt, str):
+                try:
+                    self.__storage_salt = storage_salt.encode(self.encoding)
+
+                except UnicodeEncodeError:
+                    # Bad data; don't pass it along
+                    raise ValueError(
+                        'AppriseAsset storage_salt(): '
+                        'Value provided could not be encoded')
+
+            else:  # Unsupported
+                raise ValueError(
+                    'AppriseAsset storage_salt(): Value provided must be '
+                    'a string or bytes object')
+
     def color(self, notify_type, color_type=None):
         """
         Returns an HTML mapped color based on passed in notify type
@@ -356,3 +416,40 @@ class AppriseAsset:
 
         """
         return int(value.lstrip('#'), 16)
+
+    @property
+    def plugin_paths(self):
+        """
+        Return the plugin paths defined
+        """
+        return self.__plugin_paths
+
+    @property
+    def storage_path(self):
+        """
+        Return the persistent storage path defined
+        """
+        return self.__storage_path
+
+    @property
+    def storage_mode(self):
+        """
+        Return the persistent storage mode defined
+        """
+
+        return self.__storage_mode
+
+    @property
+    def storage_salt(self):
+        """
+        Return the provided namespace salt; this is always of type bytes
+        """
+        return self.__storage_salt
+
+    @property
+    def storage_idlen(self):
+        """
+        Return the persistent storage id length
+        """
+
+        return self.__storage_idlen
diff --git a/apprise/attachment/file.py b/apprise/attachment/file.py
index 88d8f6e1..e24e1fbe 100644
--- a/apprise/attachment/file.py
+++ b/apprise/attachment/file.py
@@ -29,6 +29,7 @@
 import re
 import os
 from .base import AttachBase
+from ..utils import path_decode
 from ..common import ContentLocation
 from ..locale import gettext_lazy as _
 
@@ -57,7 +58,10 @@ class AttachFile(AttachBase):
 
         # Store path but mark it dirty since we have not performed any
         # verification at this point.
-        self.dirty_path = os.path.expanduser(path)
+        self.dirty_path = path_decode(path)
+
+        # Track our file as it was saved
+        self.__original_path = os.path.normpath(path)
         return
 
     def url(self, privacy=False, *args, **kwargs):
@@ -77,7 +81,7 @@ class AttachFile(AttachBase):
             params['name'] = self._name
 
         return 'file://{path}{params}'.format(
-            path=self.quote(self.dirty_path),
+            path=self.quote(self.__original_path),
             params='?{}'.format(self.urlencode(params, safe='/'))
             if params else '',
         )
diff --git a/apprise/cli.py b/apprise/cli.py
index ac355a38..92dca428 100644
--- a/apprise/cli.py
+++ b/apprise/cli.py
@@ -27,26 +27,27 @@
 # POSSIBILITY OF SUCH DAMAGE.
 
 import click
+import textwrap
 import logging
 import platform
 import sys
 import os
+import shutil
 import re
 
 from os.path import isfile
 from os.path import exists
-from os.path import expanduser
-from os.path import expandvars
 
-from . import NotifyType
-from . import NotifyFormat
 from . import Apprise
 from . import AppriseAsset
 from . import AppriseConfig
+from . import PersistentStore
 
-from .utils import parse_list
+from .utils import dir_size, bytes_to_str, parse_list, path_decode
 from .common import NOTIFY_TYPES
 from .common import NOTIFY_FORMATS
+from .common import PERSISTENT_STORE_MODES
+from .common import PersistentStoreState
 from .common import ContentLocation
 from .logger import logger
 
@@ -104,67 +105,94 @@ DEFAULT_PLUGIN_PATHS = (
     '/var/lib/apprise/plugins',
 )
 
+#
+# Persistent Storage
+#
+DEFAULT_STORAGE_PATH = '~/.local/share/apprise/cache'
+
 # Detect Windows
 if platform.system() == 'Windows':
     # Default Config Search Path for Windows Users
     DEFAULT_CONFIG_PATHS = (
-        expandvars('%APPDATA%\\Apprise\\apprise'),
-        expandvars('%APPDATA%\\Apprise\\apprise.conf'),
-        expandvars('%APPDATA%\\Apprise\\apprise.yml'),
-        expandvars('%APPDATA%\\Apprise\\apprise.yaml'),
-        expandvars('%LOCALAPPDATA%\\Apprise\\apprise'),
-        expandvars('%LOCALAPPDATA%\\Apprise\\apprise.conf'),
-        expandvars('%LOCALAPPDATA%\\Apprise\\apprise.yml'),
-        expandvars('%LOCALAPPDATA%\\Apprise\\apprise.yaml'),
+        '%APPDATA%\\Apprise\\apprise',
+        '%APPDATA%\\Apprise\\apprise.conf',
+        '%APPDATA%\\Apprise\\apprise.yml',
+        '%APPDATA%\\Apprise\\apprise.yaml',
+        '%LOCALAPPDATA%\\Apprise\\apprise',
+        '%LOCALAPPDATA%\\Apprise\\apprise.conf',
+        '%LOCALAPPDATA%\\Apprise\\apprise.yml',
+        '%LOCALAPPDATA%\\Apprise\\apprise.yaml',
 
         #
         # Global Support
         #
 
         # C:\ProgramData\Apprise
-        expandvars('%ALLUSERSPROFILE%\\Apprise\\apprise'),
-        expandvars('%ALLUSERSPROFILE%\\Apprise\\apprise.conf'),
-        expandvars('%ALLUSERSPROFILE%\\Apprise\\apprise.yml'),
-        expandvars('%ALLUSERSPROFILE%\\Apprise\\apprise.yaml'),
+        '%ALLUSERSPROFILE%\\Apprise\\apprise',
+        '%ALLUSERSPROFILE%\\Apprise\\apprise.conf',
+        '%ALLUSERSPROFILE%\\Apprise\\apprise.yml',
+        '%ALLUSERSPROFILE%\\Apprise\\apprise.yaml',
 
         # C:\Program Files\Apprise
-        expandvars('%PROGRAMFILES%\\Apprise\\apprise'),
-        expandvars('%PROGRAMFILES%\\Apprise\\apprise.conf'),
-        expandvars('%PROGRAMFILES%\\Apprise\\apprise.yml'),
-        expandvars('%PROGRAMFILES%\\Apprise\\apprise.yaml'),
+        '%PROGRAMFILES%\\Apprise\\apprise',
+        '%PROGRAMFILES%\\Apprise\\apprise.conf',
+        '%PROGRAMFILES%\\Apprise\\apprise.yml',
+        '%PROGRAMFILES%\\Apprise\\apprise.yaml',
 
         # C:\Program Files\Common Files
-        expandvars('%COMMONPROGRAMFILES%\\Apprise\\apprise'),
-        expandvars('%COMMONPROGRAMFILES%\\Apprise\\apprise.conf'),
-        expandvars('%COMMONPROGRAMFILES%\\Apprise\\apprise.yml'),
-        expandvars('%COMMONPROGRAMFILES%\\Apprise\\apprise.yaml'),
+        '%COMMONPROGRAMFILES%\\Apprise\\apprise',
+        '%COMMONPROGRAMFILES%\\Apprise\\apprise.conf',
+        '%COMMONPROGRAMFILES%\\Apprise\\apprise.yml',
+        '%COMMONPROGRAMFILES%\\Apprise\\apprise.yaml',
     )
 
     # Default Plugin Search Path for Windows Users
     DEFAULT_PLUGIN_PATHS = (
-        expandvars('%APPDATA%\\Apprise\\plugins'),
-        expandvars('%LOCALAPPDATA%\\Apprise\\plugins'),
+        '%APPDATA%\\Apprise\\plugins',
+        '%LOCALAPPDATA%\\Apprise\\plugins',
 
         #
         # Global Support
         #
 
         # C:\ProgramData\Apprise\plugins
-        expandvars('%ALLUSERSPROFILE%\\Apprise\\plugins'),
+        '%ALLUSERSPROFILE%\\Apprise\\plugins',
         # C:\Program Files\Apprise\plugins
-        expandvars('%PROGRAMFILES%\\Apprise\\plugins'),
+        '%PROGRAMFILES%\\Apprise\\plugins',
         # C:\Program Files\Common Files
-        expandvars('%COMMONPROGRAMFILES%\\Apprise\\plugins'),
+        '%COMMONPROGRAMFILES%\\Apprise\\plugins',
     )
 
+    #
+    # Persistent Storage
+    #
+    DEFAULT_STORAGE_PATH = '%APPDATA%/Apprise/cache'
 
-def print_help_msg(command):
-    """
-    Prints help message when -h or --help is specified.
 
+class PersistentStorageMode:
     """
-    with click.Context(command) as ctx:
-        click.echo(command.get_help(ctx))
+    Persistent Storage Modes
+    """
+    # List all detected configuration loaded
+    LIST = 'list'
+
+    # Prune persistent storage based on age
+    PRUNE = 'prune'
+
+    # Reset all (regardless of age)
+    CLEAR = 'clean'
+
+
+# Define the types in a list for validation purposes
+PERSISTENT_STORAGE_MODES = (
+    PersistentStorageMode.LIST,
+    PersistentStorageMode.PRUNE,
+    PersistentStorageMode.CLEAR,
+)
+
+if os.environ.get('APPRISE_STORAGE', '').strip():
+    # Over-ride Default Storage Path
+    DEFAULT_STORAGE_PATH = os.environ.get('APPRISE_STORAGE')
 
 
 def print_version_msg():
@@ -180,7 +208,106 @@ def print_version_msg():
     click.echo('\n'.join(result))
 
 
-@click.command(context_settings=CONTEXT_SETTINGS)
+class CustomHelpCommand(click.Command):
+    def format_help(self, ctx, formatter):
+        # Custom help message
+        content = (
+            'Send a notification to all of the specified servers '
+            'identified by their URLs with',
+            'the content provided within the title, body and '
+            'notification-type.',
+            '',
+            'For a list of all of the supported services and information on '
+            'how to use ',
+            'them, check out https://github.com/caronc/apprise')
+
+        for line in content:
+            formatter.write_text(line)
+
+        # Display options and arguments in the default format
+        self.format_options(ctx, formatter)
+        self.format_epilog(ctx, formatter)
+
+        # Custom 'Actions:' section after the 'Options:'
+        formatter.write_text('')
+        formatter.write_text('Actions:')
+
+        actions = [(
+            'storage', 'Access the persistent storage disk administration',
+            [(
+                'list',
+                'List all URL IDs associated with detected URL(s). '
+                'This is also the default action run if nothing is provided',
+            ), (
+                'prune',
+                'Eliminates stale entries found based on '
+                '--storage-prune-days (-SPD)',
+            ), (
+                'clean',
+                'Removes any persistent data created by Apprise',
+            )],
+        )]
+
+        #
+        # Some variables
+        #
+
+        # actions are indented this many spaces
+        # sub actions double this value
+        action_indent = 2
+
+        # label padding (for alignment)
+        action_label_width = 10
+
+        space = ' '
+        space_re = re.compile(r'\r*\n')
+        cols = 80
+        indent = 10
+
+        # Format each action and its subactions
+        for action, description, sub_actions in actions:
+            # Our action indent
+            ai = ' ' * action_indent
+            # Format the main action description
+            formatted_description = space_re.split(textwrap.fill(
+                description, width=(cols - indent - action_indent),
+                initial_indent=space * indent,
+                subsequent_indent=space * indent))
+            for no, line in enumerate(formatted_description):
+                if not no:
+                    formatter.write_text(
+                        f'{ai}{action:<{action_label_width}}{line}')
+
+                else:  # pragma: no cover
+                    # Note: coverage is skipped here intentionally; as of
+                    #       2024.08.13 when this was set up, the code never
+                    #       entered this area.  But we know it works because
+                    #       we repeat this process with our sub-options
+                    #       below
+                    formatter.write_text(
+                        f'{ai}{space:<{action_label_width}}{line}')
+
+            # Format each subaction
+            ai = ' ' * (action_indent * 2)
+            for action, description in sub_actions:
+                formatted_description = space_re.split(textwrap.fill(
+                    description, width=(cols - indent - (action_indent * 3)),
+                    initial_indent=space * (indent - action_indent),
+                    subsequent_indent=space * (indent - action_indent)))
+
+                for no, line in enumerate(formatted_description):
+                    if not no:
+                        formatter.write_text(
+                            f'{ai}{action:<{action_label_width}}{line}')
+                    else:
+                        formatter.write_text(
+                            f'{ai}{space:<{action_label_width}}{line}')
+
+        # Include any epilog or additional text
+        self.format_epilog(ctx, formatter)
+
+
+@click.command(context_settings=CONTEXT_SETTINGS, cls=CustomHelpCommand)
 @click.option('--body', '-b', default=None, type=str,
               help='Specify the message body. If no body is specified then '
               'content is read from <stdin>.')
@@ -190,23 +317,43 @@ def print_version_msg():
 @click.option('--plugin-path', '-P', default=None, type=str, multiple=True,
               metavar='PLUGIN_PATH',
               help='Specify one or more plugin paths to scan.')
+@click.option('--storage-path', '-S', default=DEFAULT_STORAGE_PATH, type=str,
+              metavar='STORAGE_PATH',
+              help='Specify the path to the persistent storage location '
+              '(default={}).'.format(DEFAULT_STORAGE_PATH))
+@click.option('--storage-prune-days', '-SPD', default=30,
+              type=int,
+              help='Define the age (in days) used when pruning the '
+              'persistent storage. Setting this to zero (0) will eliminate '
+              'all accumulated content. By default this value is 30 (days).')
+@click.option('--storage-uid-length', '-SUL', default=8,
+              type=int,
+              help='Define the number of unique characters to store '
+              'persistent cache in. By default this value is 8 (characters).')
+@click.option('--storage-mode', '-SM', default=PERSISTENT_STORE_MODES[0],
+              type=str, metavar='MODE',
+              help='Persistent disk storage write mode (default={}). '
+              'Possible values are "{}", and "{}".'.format(
+                  PERSISTENT_STORE_MODES[0], '", "'.join(
+                      PERSISTENT_STORE_MODES[:-1]),
+                  PERSISTENT_STORE_MODES[-1]))
 @click.option('--config', '-c', default=None, type=str, multiple=True,
               metavar='CONFIG_URL',
               help='Specify one or more configuration locations.')
 @click.option('--attach', '-a', default=None, type=str, multiple=True,
               metavar='ATTACHMENT_URL',
               help='Specify one or more attachment.')
-@click.option('--notification-type', '-n', default=NotifyType.INFO, type=str,
+@click.option('--notification-type', '-n', default=NOTIFY_TYPES[0], type=str,
               metavar='TYPE',
               help='Specify the message type (default={}). '
               'Possible values are "{}", and "{}".'.format(
-                  NotifyType.INFO, '", "'.join(NOTIFY_TYPES[:-1]),
+                  NOTIFY_TYPES[0], '", "'.join(NOTIFY_TYPES[:-1]),
                   NOTIFY_TYPES[-1]))
-@click.option('--input-format', '-i', default=NotifyFormat.TEXT, type=str,
+@click.option('--input-format', '-i', default=NOTIFY_FORMATS[0], type=str,
               metavar='FORMAT',
               help='Specify the message input format (default={}). '
               'Possible values are "{}", and "{}".'.format(
-                  NotifyFormat.TEXT, '", "'.join(NOTIFY_FORMATS[:-1]),
+                  NOTIFY_FORMATS[0], '", "'.join(NOTIFY_FORMATS[:-1]),
                   NOTIFY_FORMATS[-1]))
 @click.option('--theme', '-T', default='default', type=str, metavar='THEME',
               help='Specify the default theme.')
@@ -241,10 +388,12 @@ def print_version_msg():
               help='Display the apprise version and exit.')
 @click.argument('urls', nargs=-1,
                 metavar='SERVER_URL [SERVER_URL2 [SERVER_URL3]]',)
-def main(body, title, config, attach, urls, notification_type, theme, tag,
+@click.pass_context
+def main(ctx, body, title, config, attach, urls, notification_type, theme, tag,
          input_format, dry_run, recursion_depth, verbose, disable_async,
-         details, interpret_escapes, interpret_emojis, plugin_path, debug,
-         version):
+         details, interpret_escapes, interpret_emojis, plugin_path,
+         storage_path, storage_mode, storage_prune_days, storage_uid_length,
+         debug, version):
     """
     Send a notification to all of the specified servers identified by their
     URLs the content provided within the title, body and notification-type.
@@ -253,7 +402,7 @@ def main(body, title, config, attach, urls, notification_type, theme, tag,
     use them, check out at https://github.com/caronc/apprise
     """
     # Note: Click ignores the return values of functions it wraps, If you
-    #       want to return a specific error code, you must call sys.exit()
+    #       want to return a specific error code, you must call ctx.exit()
     #       as you will see below.
 
     debug = True if debug else False
@@ -297,7 +446,7 @@ def main(body, title, config, attach, urls, notification_type, theme, tag,
 
     if version:
         print_version_msg()
-        sys.exit(0)
+        ctx.exit(0)
 
     # Simple Error Checking
     notification_type = notification_type.strip().lower()
@@ -307,7 +456,7 @@ def main(body, title, config, attach, urls, notification_type, theme, tag,
             .format(notification_type))
         # 2 is the same exit code returned by Click if there is a parameter
         # issue.  For consistency, we also return a 2
-        sys.exit(2)
+        ctx.exit(2)
 
     input_format = input_format.strip().lower()
     if input_format not in NOTIFY_FORMATS:
@@ -316,13 +465,31 @@ def main(body, title, config, attach, urls, notification_type, theme, tag,
             .format(input_format))
         # 2 is the same exit code returned by Click if there is a parameter
         # issue.  For consistency, we also return a 2
-        sys.exit(2)
+        ctx.exit(2)
+
+    storage_mode = storage_mode.strip().lower()
+    if storage_mode not in PERSISTENT_STORE_MODES:
+        logger.error(
+            'The --storage-mode (-SM) value of {} is not supported.'
+            .format(storage_mode))
+        # 2 is the same exit code returned by Click if there is a parameter
+        # issue.  For consistency, we also return a 2
+        ctx.exit(2)
 
     if not plugin_path:
         # Prepare a default set of plugin path
         plugin_path = \
-            next((path for path in DEFAULT_PLUGIN_PATHS
-                 if exists(expanduser(path))), None)
+            [path for path in DEFAULT_PLUGIN_PATHS
+             if exists(path_decode(path))]
+
+    if storage_uid_length < 2:
+        logger.error(
+            'The --storage-uid-length (-SUL) value cannot be lower '
+            'than two (2).')
+
+        # 2 is the same exit code returned by Click if there is a
+        # parameter issue.  For consistency, we also return a 2
+        ctx.exit(2)
 
     # Prepare our asset
     asset = AppriseAsset(
@@ -346,6 +513,15 @@ def main(body, title, config, attach, urls, notification_type, theme, tag,
 
         # Load our plugins
         plugin_paths=plugin_path,
+
+        # Load our persistent storage path
+        storage_path=path_decode(storage_path),
+
+        # Our storage URL ID Length
+        storage_idlen=storage_uid_length,
+
+        # Define if we flush to disk as soon as possible or not when required
+        storage_mode=storage_mode
     )
 
     # Create our Apprise object
@@ -429,7 +605,7 @@ def main(body, title, config, attach, urls, notification_type, theme, tag,
             # new line padding between entries
             click.echo()
 
-        sys.exit(0)
+        ctx.exit(0)
         # end if details()
 
     # The priorities of what is accepted are parsed in order below:
@@ -439,7 +615,7 @@ def main(body, title, config, attach, urls, notification_type, theme, tag,
     #    4. Configuration by environment variable: APPRISE_CONFIG
     #    5. Default Configuration File(s) (if found)
     #
-    if urls:
+    elif urls and not 'storage'.startswith(urls[0]):
         if tag:
             # Ignore any tags specified
             logger.warning(
@@ -483,20 +659,145 @@ def main(body, title, config, attach, urls, notification_type, theme, tag,
     else:
         # Load default configuration
         a.add(AppriseConfig(
-            paths=[f for f in DEFAULT_CONFIG_PATHS if isfile(expanduser(f))],
+            paths=[f for f in DEFAULT_CONFIG_PATHS if isfile(path_decode(f))],
             asset=asset, recursion=recursion_depth))
 
     if len(a) == 0 and not urls:
         logger.error(
             'You must specify at least one server URL or populated '
             'configuration file.')
-        print_help_msg(main)
-        sys.exit(1)
+        click.echo(ctx.get_help())
+        ctx.exit(1)
 
     # each --tag entry comprises of a comma separated 'and' list
     # we or each of of the --tag and sets specified.
     tags = None if not tag else [parse_list(t) for t in tag]
 
+    # Determine if we're dealing with URLs or url_ids based on the first
+    # entry provided.
+    if urls and 'storage'.startswith(urls[0]):
+        #
+        # Storage Mode
+        #  - urls are now to be interpreted as best matching namespaces
+        #
+        if storage_prune_days < 0:
+            logger.error(
+                'The --storage-prune-days (-SPD) value cannot be lower '
+                'than zero (0).')
+
+            # 2 is the same exit code returned by Click if there is a
+            # parameter issue.  For consistency, we also return a 2
+            ctx.exit(2)
+
+        # Number of columns available in the terminal.  We fall back to 80
+        # columns if the size cannot be detected; 5 characters are already
+        # reserved for the counter on the left
+        (columns, _) = shutil.get_terminal_size(fallback=(80, 24))
+
+        filter_uids = urls[1:]
+        action = PERSISTENT_STORAGE_MODES[0]
+        if filter_uids:
+            _action = next(  # pragma: no branch
+                (a for a in PERSISTENT_STORAGE_MODES
+                 if a.startswith(filter_uids[0])), None)
+
+            if _action:
+                # pop top entry
+                filter_uids = filter_uids[1:]
+                action = _action
+
+        # Get our detected URL IDs
+        uids = {}
+        for plugin in (a if not tags else a.find(tag=tags)):
+            _id = plugin.url_id()
+            if not _id:
+                continue
+
+            if filter_uids and next(
+                    (False for n in filter_uids if _id.startswith(n)), True):
+                continue
+
+            if _id not in uids:
+                uids[_id] = {
+                    'plugins': [plugin],
+                    'state': PersistentStoreState.UNUSED,
+                    'size': 0,
+                }
+
+            else:
+                # It's possible to have more than one URL point to the same
+                # location (thus matching the same URL ID more than once)
+                uids[_id]['plugins'].append(plugin)
+
+        if action == PersistentStorageMode.LIST:
+            detected_uid = PersistentStore.disk_scan(
+                # Use our asset path as it has already been properly parsed
+                path=asset.storage_path,
+
+                # Provide filter if specified
+                namespace=filter_uids,
+            )
+            for _id in detected_uid:
+                size, _ = dir_size(os.path.join(asset.storage_path, _id))
+                if _id in uids:
+                    uids[_id]['state'] = PersistentStoreState.ACTIVE
+                    uids[_id]['size'] = size
+
+                elif not tags:
+                    uids[_id] = {
+                        'plugins': [],
+                        # No cross reference (wasted space?)
+                        'state': PersistentStoreState.STALE,
+                        # Acquire disk space
+                        'size': size,
+                    }
+
+            for idx, (uid, meta) in enumerate(uids.items()):
+                fg = "green" \
+                    if meta['state'] == PersistentStoreState.ACTIVE else (
+                        "red"
+                        if meta['state'] == PersistentStoreState.STALE else
+                        "white")
+
+                if idx > 0:
+                    # New line
+                    click.echo()
+                click.echo("{: 4d}. ".format(idx + 1), nl=False)
+                click.echo(click.style("{:<52} {:<8} {}".format(
+                    uid, bytes_to_str(meta['size']), meta['state']),
+                    fg=fg, bold=True))
+
+                for entry in meta['plugins']:
+                    url = entry.url(privacy=True)
+                    click.echo("{:>7} {}".format(
+                        '-',
+                        url if len(url) <= (columns - 8) else '{}...'.format(
+                            url[:columns - 11])))
+
+                    if entry.tags:
+                        click.echo("{:>10}: {}".format(
+                            'tags', ', '.join(entry.tags)))
+
+        else:  # PersistentStorageMode.PRUNE or PersistentStorageMode.CLEAR
+            if action == PersistentStorageMode.CLEAR:
+                storage_prune_days = 0
+
+            # clean up storage
+            results = PersistentStore.disk_prune(
+                # Use our asset path as it has already been properly parsed
+                path=asset.storage_path,
+                # Provide our namespaces if they exist
+                namespace=None if not filter_uids else filter_uids,
+                # Convert expiry from days to seconds
+                expires=storage_prune_days * 60 * 60 * 24,
+                action=not dry_run)
+
+            ctx.exit(0)
+            # end if disk_prune()
+
+        ctx.exit(0)
+        # end if storage()
+
     if not dry_run:
         if body is None:
             logger.trace('No --body (-b) specified; reading from stdin')
@@ -508,10 +809,10 @@ def main(body, title, config, attach, urls, notification_type, theme, tag,
             body=body, title=title, notify_type=notification_type, tag=tags,
             attach=attach)
     else:
-        # Number of rows to assume in the terminal.  In future, maybe this can
-        # be detected and made dynamic. The actual row count is 80, but 5
-        # characters are already reserved for the counter on the left
-        rows = 75
+        # Number of columns available in the terminal.  We fall back to 80
+        # columns if the size cannot be detected; 5 characters are already
+        # reserved for the counter on the left
+        (columns, _) = shutil.get_terminal_size(fallback=(80, 24))
 
         # Initialize our URL response;  This is populated within the for/loop
         # below; but plays a factor at the end when we need to determine if
@@ -520,11 +821,18 @@ def main(body, title, config, attach, urls, notification_type, theme, tag,
 
         for idx, server in enumerate(a.find(tag=tags)):
             url = server.url(privacy=True)
-            click.echo("{: 3d}. {}".format(
+            click.echo("{: 4d}. {}".format(
                 idx + 1,
-                url if len(url) <= rows else '{}...'.format(url[:rows - 3])))
+                url if len(url) <= (columns - 8) else '{}...'.format(
+                    url[:columns - 9])))
+
+            # Share our URL ID
+            click.echo("{:>10}: {}".format(
+                'uid', '- n/a -' if not server.url_id()
+                else server.url_id()))
+
             if server.tags:
-                click.echo("{} - {}".format(' ' * 5, ', '.join(server.tags)))
+                click.echo("{:>10}: {}".format('tags', ', '.join(server.tags)))
 
         # Initialize a default response of nothing matched, otherwise
         # if we matched at least one entry, we can return True
@@ -537,11 +845,11 @@ def main(body, title, config, attach, urls, notification_type, theme, tag,
 
         # Exit code 3 is used since Click uses exit code 2 if there is an
         # error with the parameters specified
-        sys.exit(3)
+        ctx.exit(3)
 
     elif result is False:
         # At least 1 notification service failed to send
-        sys.exit(1)
+        ctx.exit(1)
 
     # else:  We're good!
-    sys.exit(0)
+    ctx.exit(0)
diff --git a/apprise/common.py b/apprise/common.py
index d6fe2cd0..a8e9cd34 100644
--- a/apprise/common.py
+++ b/apprise/common.py
@@ -187,6 +187,42 @@ CONTENT_LOCATIONS = (
     ContentLocation.INACCESSIBLE,
 )
 
+
+class PersistentStoreMode:
+    # Allow persistent storage; write on demand
+    AUTO = 'auto'
+
+    # Always flush every change to disk after it's saved. This has higher i/o
+    # but ensures the disk immediately reflects what was set
+    FLUSH = 'flush'
+
+    # memory based store only
+    MEMORY = 'memory'
+
+
+PERSISTENT_STORE_MODES = (
+    PersistentStoreMode.AUTO,
+    PersistentStoreMode.FLUSH,
+    PersistentStoreMode.MEMORY,
+)
+
+
+class PersistentStoreState:
+    """
+    Defines the persistent states describing what has been cached
+    """
+    # Persistent Directory is actively cross-referenced against a matching URL
+    ACTIVE = 'active'
+
+    # Persistent Directory is no longer being used or has no cross-reference
+    STALE = 'stale'
+
+    # Persistent Directory is not utilizing any disk space at all; however,
+    # it potentially could if the plugin it successfully cross-references
+    # is ever utilized
+    UNUSED = 'unused'
+
+
 # This is a reserved tag that is automatically assigned to every
 # Notification Plugin
 MATCH_ALL_TAG = 'all'
diff --git a/apprise/config/file.py b/apprise/config/file.py
index 9f29ca20..9340f62b 100644
--- a/apprise/config/file.py
+++ b/apprise/config/file.py
@@ -29,6 +29,7 @@
 import re
 import os
 from .base import ConfigBase
+from ..utils import path_decode
 from ..common import ConfigFormat
 from ..common import ContentIncludeMode
 from ..locale import gettext_lazy as _
@@ -59,7 +60,10 @@ class ConfigFile(ConfigBase):
         super().__init__(**kwargs)
 
         # Store our file path as it was set
-        self.path = os.path.abspath(os.path.expanduser(path))
+        self.path = path_decode(path)
+
+        # Track the file as it was saved
+        self.__original_path = os.path.normpath(path)
 
         # Update the config path to be relative to our file we just loaded
         self.config_path = os.path.dirname(self.path)
@@ -89,7 +93,7 @@ class ConfigFile(ConfigBase):
             params['format'] = self.config_format
 
         return 'file://{path}{params}'.format(
-            path=self.quote(self.path),
+            path=self.quote(self.__original_path),
             params='?{}'.format(self.urlencode(params)) if params else '',
         )
 
diff --git a/apprise/exception.py b/apprise/exception.py
new file mode 100644
index 00000000..40967f5a
--- /dev/null
+++ b/apprise/exception.py
@@ -0,0 +1,53 @@
+# -*- coding: utf-8 -*-
+# BSD 2-Clause License
+#
+# Apprise - Push Notification Library.
+# Copyright (c) 2024, Chris Caron <lead2gold@gmail.com>
+#
+# Redistribution and use in source and binary forms, with or without
+# modification, are permitted provided that the following conditions are met:
+#
+# 1. Redistributions of source code must retain the above copyright notice,
+#    this list of conditions and the following disclaimer.
+#
+# 2. Redistributions in binary form must reproduce the above copyright notice,
+#    this list of conditions and the following disclaimer in the documentation
+#    and/or other materials provided with the distribution.
+#
+# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
+# AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
+# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
+# ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE
+# LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
+# CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
+# SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
+# INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
+# CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
+# ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
+# POSSIBILITY OF SUCH DAMAGE.
+import errno
+
+
+class AppriseException(Exception):
+    """
+    Base Apprise Exception Class
+    """
+    def __init__(self, message, error_code=0):
+        super().__init__(message)
+        self.error_code = error_code
+
+
+class AppriseDiskIOError(AppriseException):
+    """
+    Thrown when a disk i/o error occurs
+    """
+    def __init__(self, message, error_code=errno.EIO):
+        super().__init__(message, error_code=error_code)
+
+
+class AppriseFileNotFound(AppriseDiskIOError, FileNotFoundError):
+    """
+    Thrown when a persistent file could not be found
+    """
+    def __init__(self, message):
+        super().__init__(message, error_code=errno.ENOENT)
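+
+
+# A loose usage sketch (illustrative only; the persistent storage call
+# referenced below is an assumption and not part of this module):
+#
+#   try:
+#       store.write(b'data', key='example')
+#   except AppriseDiskIOError as e:
+#       logger.warning('Persistent write failed: %s', e)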
diff --git a/apprise/manager.py b/apprise/manager.py
index abaf8cfb..ab7f6c99 100644
--- a/apprise/manager.py
+++ b/apprise/manager.py
@@ -36,6 +36,7 @@ import threading
 from .utils import import_module
 from .utils import Singleton
 from .utils import parse_list
+from .utils import path_decode
 from os.path import dirname
 from os.path import abspath
 from os.path import join
@@ -373,7 +374,7 @@ class PluginManager(metaclass=Singleton):
             return
 
         for _path in paths:
-            path = os.path.abspath(os.path.expanduser(_path))
+            path = path_decode(_path)
             if (cache and path in self._paths_previously_scanned) \
                     or not os.path.exists(path):
                 # We're done as we've already scanned this
diff --git a/apprise/persistent_store.py b/apprise/persistent_store.py
new file mode 100644
index 00000000..58eb99e9
--- /dev/null
+++ b/apprise/persistent_store.py
@@ -0,0 +1,1676 @@
+# -*- coding: utf-8 -*-
+#
+# Copyright (C) 2024 Chris Caron <lead2gold@gmail.com>
+# All rights reserved.
+#
+# This code is licensed under the MIT License.
+#
+# Permission is hereby granted, free of charge, to any person obtaining a copy
+# of this software and associated documentation files(the "Software"), to deal
+# in the Software without restriction, including without limitation the rights
+# to use, copy, modify, merge, publish, distribute, sublicense, and / or sell
+# copies of the Software, and to permit persons to whom the Software is
+# furnished to do so, subject to the following conditions :
+#
+# The above copyright notice and this permission notice shall be included in
+# all copies or substantial portions of the Software.
+#
+# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.IN NO EVENT SHALL THE
+# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
+# THE SOFTWARE.
+import os
+import re
+import gzip
+import zlib
+import base64
+import glob
+import tempfile
+import json
+import binascii
+from . import exception
+from itertools import chain
+from datetime import datetime, timezone, timedelta
+import time
+import hashlib
+from .common import PersistentStoreMode, PERSISTENT_STORE_MODES
+from .utils import path_decode
+from .logger import logger
+
+# Used for writing/reading time stored in cache file
+EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)
+
+# isoformat is spelled out for compatibility with Python v3.6
+AWARE_DATE_ISO_FORMAT = '%Y-%m-%dT%H:%M:%S.%f%z'
+NAIVE_DATE_ISO_FORMAT = '%Y-%m-%dT%H:%M:%S.%f'
+
+
+def _ntf_tidy(ntf):
+    """
+    Reusable NamedTemporaryFile cleanup
+    """
+    if ntf:
+        # Cleanup
+        try:
+            ntf.close()
+
+        except OSError:
+            # Already closed
+            pass
+
+        try:
+            os.unlink(ntf.name)
+            logger.trace(
+                'Persistent temporary file removed: %s', ntf.name)
+
+        except (FileNotFoundError, AttributeError):
+            # AttributeError: something weird was passed in, no action required
+            # FileNotFound: no worries; we were removing it anyway
+            pass
+
+        except (OSError, IOError) as e:
+            logger.error(
+                'Persistent temporary file removal failed: %s',
+                ntf.name)
+            logger.debug(
+                'Persistent Storage Exception: %s' % str(e))
+
+
+class CacheObject:
+
+    hash_engine = hashlib.sha256
+    hash_length = 6
+
+    def __init__(self, value=None, expires=False, persistent=True):
+        """
+        Tracks our objects and associates a time limit with them
+        """
+
+        self.__value = value
+        self.__class_name = value.__class__.__name__
+        self.__expires = None
+
+        if expires:
+            self.set_expiry(expires)
+
+        # Whether or not we persist this object to disk
+        self.__persistent = True if persistent else False
+
+    def set(self, value, expires=None, persistent=None):
+        """
+        Sets fields on demand; if set to None, then they are left as is
+
+        The intent of set is that it allows you to set a new value
+        and optionally alter meta information against it.
+
+        If expires or persistent isn't specified then their previous values
+        are used.
+
+        """
+
+        self.__value = value
+        self.__class_name = value.__class__.__name__
+        if expires is not None:
+            self.set_expiry(expires)
+
+        if persistent is not None:
+            self.__persistent = True if persistent else False
+
+    def set_expiry(self, expires=None):
+        """
+        Sets a new expiry
+        """
+
+        if isinstance(expires, datetime):
+            self.__expires = expires.astimezone(timezone.utc)
+
+        elif expires in (None, False):
+            # Accepted - no expiry
+            self.__expires = None
+
+        elif expires is True:
+            # Force expiry to now
+            self.__expires = datetime.now(tz=timezone.utc)
+
+        elif isinstance(expires, (float, int)):
+            self.__expires = \
+                datetime.now(tz=timezone.utc) + timedelta(seconds=expires)
+
+        else:  # Unsupported
+            raise AttributeError(
+                f"An invalid expiry time ({expires} was specified")
+
+    def hash(self):
+        """
+        Our checksum to track the validity of our data
+        """
+        try:
+            return self.hash_engine(
+                str(self).encode('utf-8'), usedforsecurity=False).hexdigest()
+
+        except TypeError:
+            # Python <= v3.8 - usedforsecurity flag does not work
+            return self.hash_engine(str(self).encode('utf-8')).hexdigest()
+
+    def json(self):
+        """
+        Returns our preparable json object
+        """
+
+        return {
+            'v': self.__value,
+            'x': (self.__expires - EPOCH).total_seconds()
+            if self.__expires else None,
+            'c': self.__class_name if not isinstance(self.__value, datetime)
+            else (
+                'aware_datetime' if self.__value.tzinfo else 'naive_datetime'),
+            '!': self.hash()[:self.hash_length],
+        }
+
+    @staticmethod
+    def instantiate(content, persistent=True, verify=True):
+        """
+        Loads back data read in and returns a CacheObject or None if it could
+        not be loaded. You can pass in the contents of CacheObject.json() and
+        you'll receive a copy assuming the hash checks okay
+
+        """
+        try:
+            value = content['v']
+            expires = content['x']
+            if expires is not None:
+                expires = datetime.fromtimestamp(expires, timezone.utc)
+
+            # Acquire some useful integrity objects
+            class_name = content.get('c', '')
+            if not isinstance(class_name, str):
+                raise TypeError('Class name not expected string')
+
+            hashsum = content.get('!', '')
+            if not isinstance(hashsum, str):
+                raise TypeError('Hash sum not expected string')
+
+        except (TypeError, KeyError) as e:
+            logger.trace(f'CacheObject could not be parsed from {content}')
+            logger.trace('CacheObject exception: %s' % str(e))
+            return None
+
+        if class_name in ('aware_datetime', 'naive_datetime', 'datetime'):
+            # If datetime is detected, it will fall under the naive category
+            iso_format = AWARE_DATE_ISO_FORMAT \
+                if class_name[0] == 'a' else NAIVE_DATE_ISO_FORMAT
+            try:
+                # Python v3.6 Support
+                value = datetime.strptime(value, iso_format)
+
+            except (TypeError, ValueError):
+                # TypeError is thrown if content is not string
+                # ValueError is thrown if the string is not a valid format
+                logger.trace(
+                    f'CacheObject (dt) corrupted loading from {content}')
+                return None
+
+        elif class_name == 'bytes':
+            try:
+                # Convert our object back to a bytes
+                value = base64.b64decode(value)
+
+            except binascii.Error:
+                logger.trace(
+                    f'CacheObject (bin) corrupted loading from {content}')
+                return None
+
+        # Initialize our object
+        co = CacheObject(value, expires, persistent=persistent)
+        if verify and co.hash()[:co.hash_length] != hashsum:
+            # Our object was tampered with
+            logger.debug(f'Tampering detected with cache entry {co}')
+            del co
+            return None
+
+        return co
+
+    @property
+    def value(self):
+        """
+        Returns our value
+        """
+        return self.__value
+
+    @property
+    def persistent(self):
+        """
+        Returns our persistent value
+        """
+        return self.__persistent
+
+    @property
+    def expires(self):
+        """
+        Returns the datetime the object will expire
+        """
+        return self.__expires
+
+    @property
+    def expires_sec(self):
+        """
+        Returns the number of seconds from now the object will expire
+        """
+
+        return None if self.__expires is None else max(
+            0.0, (self.__expires - datetime.now(tz=timezone.utc))
+            .total_seconds())
+
+    def __bool__(self):
+        """
+        Returns True if the object hasn't expired, and False if it has
+        """
+        if self.__expires is None:
+            # No Expiry
+            return True
+
+        # Calculate if we've expired or not
+        return self.__expires > datetime.now(tz=timezone.utc)
+
+    def __eq__(self, other):
+        """
+        Handles equality == flag
+        """
+        if isinstance(other, CacheObject):
+            return str(self) == str(other)
+
+        return self.__value == other
+
+    def __str__(self):
+        """
+        string output of our data
+        """
+        persistent = '+' if self.persistent else '-'
+        return f'{self.__class_name}:{persistent}:{self.__value} expires: ' +\
+            ('never' if self.__expires is None
+             else self.__expires.strftime(NAIVE_DATE_ISO_FORMAT))
+
+
+class CacheJSONEncoder(json.JSONEncoder):
+    """
+    A JSON Encoder for handling each of our cache objects
+    """
+
+    def default(self, entry):
+        if isinstance(entry, datetime):
+            return entry.strftime(
+                AWARE_DATE_ISO_FORMAT if entry.tzinfo is not None
+                else NAIVE_DATE_ISO_FORMAT)
+
+        elif isinstance(entry, CacheObject):
+            return entry.json()
+
+        elif isinstance(entry, bytes):
+            return base64.b64encode(entry).decode('utf-8')
+
+        return super().default(entry)
+
+
+class PersistentStore:
+    """
+    An object to make working with persistent storage easier
+
+    read() and write() are used for direct file i/o
+
+    set(), get() are used for caching
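+
+    A rough usage sketch follows (illustrative only; the exact method
+    signatures are defined further below):
+
+        store = PersistentStore(path='/some/path', namespace='abc123')
+        store.set('key', 'value')
+        value = store.get('key')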
+    """
+
+    # The maximum file-size we will allow the persistent store to grow to
+    # 1 MB = 1048576 bytes
+    max_file_size = 1048576
+
+    # 31 days in seconds
+    default_file_expiry = 2678400
+
+    # File encoding to use
+    encoding = 'utf-8'
+
+    # Default data set
+    base_key = 'default'
+
+    # Directory to store cache
+    __cache_key = 'cache'
+
+    # Our Temporary working directory
+    temp_dir = 'tmp'
+
+    # The directory our persistent store content gets placed in
+    data_dir = 'var'
+
+    # Our Persistent Store File Extension
+    __extension = '.psdata'
+
+    # Identify our backup file extension
+    __backup_extension = '._psbak'
+
+    # Used to verify the key specified is valid
+    #  - must start with an alpha-numeric character
+    #  - following optional characters can include period, underscore
+    #    and dash
+    __valid_key = re.compile(r'[a-z0-9][a-z0-9._-]*', re.I)
+
+    # Reference only
+    __not_found_ref = (None, None)
+
+    def __init__(self, path=None, namespace='default', mode=None):
+        """
+        Provide the namespace to work within. Namespaces can only contain
+        alpha-numeric characters with the exception of '-' (dash), '_'
+        (underscore), and '.' (period). The namespace must be relative
+        to the current URL being controlled.
+        """
+        # Initialize our mode so __del__() calls don't go bad on the
+        # error checking below
+        self.__mode = None
+
+        # Populated only once and after size() is called
+        self.__exclude_list = None
+
+        # Files to renew on calls to flush
+        self.__renew = set()
+
+        if not isinstance(namespace, str) \
+                or not self.__valid_key.match(namespace):
+            raise AttributeError(
+                f"Persistent Storage namespace ({namespace}) provided is"
+                " invalid")
+
+        if isinstance(path, str):
+            # A storage path has been defined
+            if mode is None:
+                # Store the default if no mode was provided alongside it
+                mode = PERSISTENT_STORE_MODES[0]
+
+            # Store our information
+            self.__base_path = os.path.join(path_decode(path), namespace)
+            self.__temp_path = os.path.join(self.__base_path, self.temp_dir)
+            self.__data_path = os.path.join(self.__base_path, self.data_dir)
+
+        else:  # If no storage path is provided we set our mode to MEMORY
+            mode = PersistentStoreMode.MEMORY
+            self.__base_path = None
+            self.__temp_path = None
+            self.__data_path = None
+
+        if mode not in PERSISTENT_STORE_MODES:
+            raise AttributeError(
+                f"Persistent Storage mode ({mode}) provided is invalid")
+
+        # Store our mode
+        self.__mode = mode
+
+        # Tracks when we have content to flush
+        self.__dirty = False
+
+        # A caching value to track persistent storage disk size
+        self.__cache_size = None
+        self.__cache_files = {}
+
+        # Internal Cache
+        self._cache = None
+
+        # Prepare our environment
+        self.__prepare()
+
+    def read(self, key=None, compress=True, expires=False):
+        """
+        Returns the content of the persistent store object
+
+        if expires is set to False (the default), the file's modify time is
+        renewed on the next flush, preventing it from getting caught in
+        prune calls.  It's a means of allowing the content to persist and
+        not get cleaned up later.
+
+        Content is always returned as a byte object
+        """
+        if key is None:
+            key = self.base_key
+
+        try:
+            with self.open(key, mode="rb", compress=compress) as fd:
+                results = fd.read(self.max_file_size)
+                if expires is False:
+                    self.__renew.add(os.path.join(
+                        self.__data_path, f"{key}{self.__extension}"))
+
+                return results
+
+        except (FileNotFoundError, exception.AppriseDiskIOError):
+            # FileNotFoundError: No problem
+            # exception.AppriseDiskIOError:
+            #   - Logging of error already occurred inside self.open()
+            pass
+
+        except (OSError, zlib.error, EOFError, UnicodeDecodeError,
+                IOError) as e:
+            # We can't access the file or it does not exist
+            logger.warning('Could not read with persistent key: %s', key)
+            logger.debug('Persistent Storage Exception: %s' % str(e))
+
+        # Return None
+        return None
+
+    def write(self, data, key=None, compress=True, _recovery=False):
+        """
+        Writes the content to the persistent store if it doesn't exceed our
+        filesize limit.
+
+        Content is always written as a byte object
+
+        _recovery is reserved for internal usage and should not be changed
+        """
+
+        if key is None:
+            key = self.base_key
+
+        elif not isinstance(key, str) or not self.__valid_key.match(key):
+            raise AttributeError(
+                f"Persistent Storage key ({key} provided is invalid")
+
+        if not isinstance(data, (bytes, str)):
+            # One last check: we will accept objects with a read() method,
+            # with the expectation it will return a binary dataset
+            if not (hasattr(data, 'read') and callable(getattr(data, 'read'))):
+                raise AttributeError(
+                    "Invalid data type {} provided to Persistent Storage"
+                    .format(type(data)))
+
+            try:
+                # Read in our data
+                data = data.read()
+                if not isinstance(data, (bytes, str)):
+                    raise AttributeError(
+                        "Invalid data type {} provided to Persistent Storage"
+                        .format(type(data)))
+
+            except Exception as e:
+                logger.warning(
+                    'Could not read() from potential iostream with persistent '
+                    'key: %s', key)
+                logger.debug('Persistent Storage Exception: %s' % str(e))
+                raise exception.AppriseDiskIOError(
+                    "Invalid data type {} provided to Persistent Storage"
+                    .format(type(data)))
+
+        if self.__mode == PersistentStoreMode.MEMORY:
+            # Nothing further can be done
+            return False
+
+        if _recovery:
+            # Attempt to recover from a bad directory structure or setup
+            self.__prepare()
+
+        # generate our filename based on the key provided
+        io_file = os.path.join(self.__data_path, f"{key}{self.__extension}")
+
+        # Calculate the files current filesize
+        try:
+            prev_size = os.stat(io_file).st_size
+
+        except FileNotFoundError:
+            # No worries, no size to accommodate
+            prev_size = 0
+
+        except (OSError, IOError) as e:
+            # Permission error of some kind or disk problem...
+            # There is nothing we can do at this point
+            logger.warning('Could not write with persistent key: %s', key)
+            logger.debug('Persistent Storage Exception: %s' % str(e))
+            return False
+
+        # Create a temporary file to write our content into
+        # ntf = NamedTemporaryFile
+        ntf = None
+        new_file_size = 0
+        try:
+            if isinstance(data, str):
+                data = data.encode(self.encoding)
+
+            ntf = tempfile.NamedTemporaryFile(
+                mode="wb", dir=self.__temp_path,
+                delete=False)
+
+            # Close our file
+            ntf.close()
+
+            # Pointer to our open call
+            _open = open if not compress else gzip.open
+
+            with _open(ntf.name, mode='wb') as fd:
+                # Write our content
+                fd.write(data)
+
+            # Get our file size
+            new_file_size = os.stat(ntf.name).st_size
+
+            # Log our progress
+            logger.trace(
+                'Wrote %d bytes of data to persistent key: %s',
+                new_file_size, key)
+
+        except FileNotFoundError:
+            # This happens if the directory path is gone preventing the file
+            # from being created...
+            if not _recovery:
+                return self.write(
+                    data=data, key=key, compress=compress, _recovery=True)
+
+            # We've already made our best effort to recover; if we are
+            # still here, we have no choice but to exit
+
+            # Tidy our Named Temporary File
+            _ntf_tidy(ntf)
+
+            # Early Exit
+            return False
+
+        except (OSError, UnicodeEncodeError, IOError, zlib.error) as e:
+            # We can't access the file or it does not exist
+            logger.warning('Could not write to persistent key: %s', key)
+            logger.debug('Persistent Storage Exception: %s' % str(e))
+
+            # Tidy our Named Temporary File
+            _ntf_tidy(ntf)
+
+            return False
+
+        if self.max_file_size > 0 and (
+                new_file_size + self.size() - prev_size) > self.max_file_size:
+            # The content to store is too large
+            logger.warning(
+                'Persistent content exceeds allowable maximum file length '
+                '({}KB); provided {}KB'.format(
+                    int(self.max_file_size / 1024),
+                    int(new_file_size / 1024)))
+            return False
+
+        # Return our final move
+        if not self.__move(ntf.name, io_file):
+            # Attempt to restore things as they were
+
+            # Tidy our Named Temporary File
+            _ntf_tidy(ntf)
+            return False
+
+        # Reset our reference variables
+        self.__cache_size = None
+        self.__cache_files.clear()
+
+        # Content installed
+        return True
+
+    def __move(self, src, dst):
+        """
+        Moves the new file in place and handles the old if it exists already
+        If the transaction fails in any way, the old file is swapped back.
+
+        Function returns True if successful and False if not.
+        """
+
+        # A temporary backup of the file we want to move in place
+        dst_backup = dst[:-len(self.__backup_extension)] + \
+            self.__backup_extension
+
+        #
+        # Backup the old file (if it exists) allowing us to have a restore
+        # point in the event of a failure
+        #
+        try:
+            # make sure the file isn't already present; if it is, remove it
+            os.unlink(dst_backup)
+            logger.trace(
+                'Removed previous persistent backup file: %s', dst_backup)
+
+        except FileNotFoundError:
+            # no worries; we were removing it anyway
+            pass
+
+        except (OSError, IOError) as e:
+            # Permission error of some kind or disk problem...
+            # There is nothing we can do at this point
+            logger.warning(
+                'Could not remove previous persistent data backup: %s',
+                dst_backup)
+            logger.debug('Persistent Storage Exception: %s' % str(e))
+            return False
+
+        try:
+            # Back our file up so we have a fallback
+            os.rename(dst, dst_backup)
+            logger.trace(
+                'Persistent storage backup file created: %s', dst_backup)
+
+        except FileNotFoundError:
+            # Not a problem; this is a brand new file we're writing
+            # There is nothing to backup
+            pass
+
+        except (OSError, IOError) as e:
+            # This isn't good... we couldn't backup the existing file
+            logger.warning(
+                'Could not backup persistent content %s -> %s',
+                dst, os.path.basename(dst_backup))
+            logger.debug('Persistent Storage Exception: %s' % str(e))
+            return False
+
+        #
+        # Now place the new file
+        #
+        try:
+            os.rename(src, dst)
+            logger.trace('Persistent file installed: %s', dst)
+
+        except (OSError, IOError) as e:
+            # This isn't good... we couldn't put our new file in place
+            # Begin fall-back process before leaving the function
+            logger.warning(
+                'Could not install persistent content %s -> %s',
+                src, os.path.basename(dst))
+            logger.debug('Persistent Storage Exception: %s' % str(e))
+            try:
+                # Restore our old backup (if it exists)
+                os.rename(dst_backup, dst)
+                logger.trace(
+                    'Restoring original persistent content: %s', dst)
+
+            except FileNotFoundError:
+                # Not a problem
+                pass
+
+            except (OSError, IOError) as e:
+                # Permission error of some kind or disk problem...
+                # There is nothing we can do at this point
+                logger.warning(
+                    'Failed to restore original persistent file: %s', dst)
+                logger.debug('Persistent Storage Exception: %s' % str(e))
+
+            return False
+
+        return True
+
+    def open(self, key=None, mode='r', buffering=-1, encoding=None,
+             errors=None, newline=None, closefd=True, opener=None,
+             compress=False, compresslevel=9):
+        """
+        Returns an open file handle to the file within our namespace
+        identified by the key provided.
+
+        If no key is provided, then the default is used
+        """
+
+        if key is None:
+            key = self.base_key
+
+        elif not isinstance(key, str) or not self.__valid_key.match(key):
+            raise AttributeError(
+                f"Persistent Storage key ({key} provided is invalid")
+
+        if self.__mode == PersistentStoreMode.MEMORY:
+            # Nothing further can be done
+            raise FileNotFoundError()
+
+        io_file = os.path.join(self.__data_path, f"{key}{self.__extension}")
+        try:
+            return open(
+                io_file, mode=mode, buffering=buffering, encoding=encoding,
+                errors=errors, newline=newline, closefd=closefd,
+                opener=opener) \
+                if not compress else gzip.open(
+                    io_file, mode=mode, compresslevel=compresslevel,
+                    encoding=encoding, errors=errors, newline=newline)
+
+        except FileNotFoundError:
+            # pass along (but wrap with Apprise exception)
+            raise exception.AppriseFileNotFound(
+                f"No such file or directory: '{io_file}'")
+
+        except (OSError, IOError, zlib.error) as e:
+            # We can't access the file or it does not exist
+            logger.warning('Could not read with persistent key: %s', key)
+            logger.debug('Persistent Storage Exception: %s' % str(e))
+            raise exception.AppriseDiskIOError(str(e))
+
+    def get(self, key, default=None, lazy=True):
+        """
+        Fetches a value from our cache, returning default if the key is
+        missing or the entry has expired
+        """
+
+        if self._cache is None and not self.__load_cache():
+            return default
+
+        if key in self._cache and \
+                not self.__mode == PersistentStoreMode.MEMORY and \
+                not self.__dirty:
+
+            # ensure we renew our content
+            self.__renew.add(self.cache_file)
+
+        return self._cache[key].value \
+            if key in self._cache and self._cache[key] else default
+
+    def set(self, key, value, expires=None, persistent=True, lazy=True):
+        """
+        Caches the provided value against the specified key
+        """
+
+        if self._cache is None and not self.__load_cache():
+            return False
+
+        cache = CacheObject(value, expires, persistent=persistent)
+        # Fetch our cache value
+        try:
+            if lazy and cache == self._cache[key]:
+                # We're done; nothing further to do
+                return True
+
+        except KeyError:
+            pass
+
+        # Store our new cache
+        self._cache[key] = CacheObject(value, expires, persistent=persistent)
+
+        # Set our dirty flag
+        self.__dirty = persistent
+
+        if self.__dirty and self.__mode == PersistentStoreMode.FLUSH:
+            # Flush changes to disk
+            return self.flush()
+
+        return True
+
+    def clear(self, *args):
+        """
+        Remove one or more cache entries by their key
+
+            e.g: clear('key')
+                 clear('key1', 'key2', 'key-12')
+
+        Or clear everything:
+                 clear()
+        """
+        if self._cache is None and not self.__load_cache():
+            return False
+
+        if args:
+            for arg in args:
+
+                try:
+                    del self._cache[arg]
+
+                    # Set our dirty flag (if not set already)
+                    self.__dirty = True
+
+                except KeyError:
+                    pass
+
+        elif self._cache:
+            # Request to remove everything and there is something to remove
+
+            # Set our dirty flag (if not set already)
+            self.__dirty = True
+
+            # Reset our object
+            self._cache.clear()
+
+        if self.__dirty and self.__mode == PersistentStoreMode.FLUSH:
+            # Flush changes to disk
+            return self.flush()
+
+    def prune(self):
+        """
+        Eliminates expired cache entries
+        """
+        if self._cache is None and not self.__load_cache():
+            return False
+
+        change = False
+        for key in list(self._cache.keys()):
+            if key not in self:
+                # It's identified as being expired
+                if not change and self._cache[key].persistent:
+                    # track change only if content was persistent
+                    change = True
+
+                    # Set our dirty flag
+                    self.__dirty = True
+
+                del self._cache[key]
+
+        if self.__dirty and self.__mode == PersistentStoreMode.FLUSH:
+            # Flush changes to disk
+            return self.flush()
+
+        return change
+
+    def __load_cache(self, _recovery=False):
+        """
+        Loads our cache
+
+        _recovery is reserved for internal usage and should not be changed
+        """
+
+        # Prepare our dirty flag
+        self.__dirty = False
+
+        if self.__mode == PersistentStoreMode.MEMORY:
+            # Nothing further to do
+            self._cache = {}
+            return True
+
+        # Prepare our cache file
+        cache_file = self.cache_file
+        try:
+            with gzip.open(cache_file, 'rb') as f:
+                # Read our content from disk
+                self._cache = {}
+                for k, v in json.loads(f.read().decode(self.encoding)).items():
+                    co = CacheObject.instantiate(v)
+                    if co:
+                        # Verify our object before assigning it
+                        self._cache[k] = co
+
+                    elif not self.__dirty:
+                        # Track changes from our loadset
+                        self.__dirty = True
+
+        except (UnicodeDecodeError, json.decoder.JSONDecodeError, zlib.error,
+                TypeError, AttributeError, EOFError):
+
+            # Let users know there was a problem
+            logger.warning(
+                'Detected corrupted persistent cache content: %s',
+                cache_file)
+
+            if not _recovery:
+                try:
+                    os.unlink(cache_file)
+                    logger.trace(
+                        'Removed previous persistent cache content: %s',
+                        cache_file)
+
+                except FileNotFoundError:
+                    # no worries; we were removing it anyway
+                    pass
+
+                except (OSError, IOError) as e:
+                    # Permission error of some kind or disk problem...
+                    # There is nothing we can do at this point
+                    logger.warning(
+                        'Could not remove persistent cache content: %s',
+                        cache_file)
+                    logger.debug('Persistent Storage Exception: %s' % str(e))
+                    return False
+                return self.__load_cache(_recovery=True)
+
+            return False
+
+        except FileNotFoundError:
+            # No problem; no cache to load
+            self._cache = {}
+
+        except (OSError, IOError) as e:
+            # Permission error of some kind or disk problem...
+            # There is nothing we can do at this point
+            logger.warning(
+                'Could not load persistent cache for namespace %s',
+                os.path.basename(self.__base_path))
+            logger.debug('Persistent Storage Exception: %s' % str(e))
+            return False
+
+        # Cache successfully loaded
+        return True
+
+    def __prepare(self, flush=True):
+        """
+        Prepares a working environment
+        """
+        if self.__mode != PersistentStoreMode.MEMORY:
+            # Ensure our path exists
+            try:
+                os.makedirs(self.__base_path, mode=0o770, exist_ok=True)
+
+            except (OSError, IOError) as e:
+                # Permission error
+                logger.debug(
+                    'Could not create persistent store directory %s',
+                    self.__base_path)
+                logger.debug('Persistent Storage Exception: %s' % str(e))
+
+                # Mode changed back to MEMORY
+                self.__mode = PersistentStoreMode.MEMORY
+
+            # Ensure our path exists
+            try:
+                os.makedirs(self.__temp_path, mode=0o770, exist_ok=True)
+
+            except (OSError, IOError) as e:
+                # Permission error
+                logger.debug(
+                    'Could not create persistent store directory %s',
+                    self.__temp_path)
+                logger.debug('Persistent Storage Exception: %s' % str(e))
+
+                # Mode changed back to MEMORY
+                self.__mode = PersistentStoreMode.MEMORY
+
+            try:
+                os.makedirs(self.__data_path, mode=0o770, exist_ok=True)
+
+            except (OSError, IOError) as e:
+                # Permission error
+                logger.debug(
+                    'Could not create persistent store directory %s',
+                    self.__data_path)
+                logger.debug('Persistent Storage Exception: %s' % str(e))
+
+                # Mode changed back to MEMORY
+                self.__mode = PersistentStoreMode.MEMORY
+
+            if self.__mode is PersistentStoreMode.MEMORY:
+                logger.warning(
+                    'The persistent storage could not be fully initialized; '
+                    'operating in MEMORY mode')
+
+            else:
+                if self._cache:
+                    # Recovery taking place
+                    self.__dirty = True
+                    logger.warning(
+                        'The persistent storage environment was disrupted')
+
+                    if self.__mode is PersistentStoreMode.FLUSH and flush:
+                        # Flush changes to disk
+                        return self.flush(_recovery=True)
+
+    def flush(self, force=False, _recovery=False):
+        """
+        Saves our cache to disk
+        """
+
+        if self._cache is None or self.__mode == PersistentStoreMode.MEMORY:
+            # nothing to do
+            return True
+
+        while self.__renew:
+            # update our files
+            path = self.__renew.pop()
+            ftime = time.time()
+
+            try:
+                # (access_time, modify_time)
+                os.utime(path, (ftime, ftime))
+                logger.trace('file timestamp updated: %s', path)
+
+            except FileNotFoundError:
+                # No worries... move along
+                pass
+
+            except (OSError, IOError) as e:
+                # We can't access the file or it does not exist
+                logger.debug('Could not update file timestamp: %s', path)
+                logger.debug('Persistent Storage Exception: %s' % str(e))
+
+        if not force and self.__dirty is False:
+            # Nothing further to do
+            logger.trace('Persistent cache is consistent with memory map')
+            return True
+
+        if _recovery:
+            # Attempt to recover from a bad directory structure or setup
+            self.__prepare(flush=False)
+
+        # Unset our size lazy setting
+        self.__cache_size = None
+        self.__cache_files.clear()
+
+        # Prepare our cache file
+        cache_file = self.cache_file
+        if not self._cache:
+            #
+            # We're deleting the cache file as there are no entries left in it
+            #
+            backup_file = cache_file[:-len(self.__backup_extension)] + \
+                self.__backup_extension
+
+            try:
+                os.unlink(backup_file)
+                logger.trace(
+                    'Removed previous persistent cache backup: %s',
+                    backup_file)
+
+            except FileNotFoundError:
+                # no worries; we were removing it anyway
+                pass
+
+            except (OSError, IOError) as e:
+                # Permission error of some kind or disk problem...
+                # There is nothing we can do at this point
+                logger.warning(
+                    'Could not remove persistent cache backup: %s',
+                    backup_file)
+                logger.debug('Persistent Storage Exception: %s' % str(e))
+                return False
+
+            try:
+                os.rename(cache_file, backup_file)
+                logger.trace(
+                    'Persistent cache backup file created: %s',
+                    backup_file)
+
+            except FileNotFoundError:
+                # Not a problem; do not create a log entry
+                pass
+
+            except (OSError, IOError) as e:
+                # This isn't good... we couldn't put our new file in place
+                logger.warning(
+                    'Could not remove stale persistent cache file: %s',
+                    cache_file)
+                logger.debug('Persistent Storage Exception: %s' % str(e))
+                return False
+            return True
+
+        #
+        # If we get here, we need to update our file based cache
+        #
+
+        # ntf = NamedTemporaryFile
+        ntf = None
+
+        try:
+            ntf = tempfile.NamedTemporaryFile(
+                mode="w+", encoding=self.encoding, dir=self.__temp_path,
+                delete=False)
+
+            ntf.close()
+
+        except FileNotFoundError:
+            # This happens if the directory path is gone preventing the file
+            # from being created...
+            if not _recovery:
+                return self.flush(force=True, _recovery=True)
+
+            # We've already made our best effort to recover; if we are
+            # still here, we have no choice but to exit
+
+            # Tidy our Named Temporary File
+            _ntf_tidy(ntf)
+
+            # Early Exit
+            return False
+
+        except OSError as e:
+            logger.error(
+                'Persistent temporary directory inaccessible: %s',
+                self.__temp_path)
+            logger.debug('Persistent Storage Exception: %s' % str(e))
+
+            # Tidy our Named Temporary File
+            _ntf_tidy(ntf)
+
+            # Early Exit
+            return False
+
+        try:
+            # write our content currently saved to disk to our temporary file
+            with gzip.open(ntf.name, 'wb') as f:
+                # Write our content to disk
+                f.write(json.dumps(
+                    {k: v for k, v in self._cache.items()
+                     if v and v.persistent},
+                    separators=(',', ':'),
+                    cls=CacheJSONEncoder).encode(self.encoding))
+
+        except TypeError as e:
+            # JSON object contains content that can not be encoded to disk
+            logger.error(
+                'Persistent temporary file can not be written to '
+                'due to bad input data: %s', ntf.name)
+            logger.debug('Persistent Storage Exception: %s' % str(e))
+
+            # Tidy our Named Temporary File
+            _ntf_tidy(ntf)
+
+            # Early Exit
+            return False
+
+        except (OSError, EOFError, zlib.error) as e:
+            logger.error(
+                'Persistent temporary file inaccessible: %s',
+                ntf.name)
+            logger.debug('Persistent Storage Exception: %s' % str(e))
+
+            # Tidy our Named Temporary File
+            _ntf_tidy(ntf)
+
+            # Early Exit
+            return False
+
+        if not self.__move(ntf.name, cache_file):
+            # Attempt to restore things as they were
+
+            # Tidy our Named Temporary File
+            _ntf_tidy(ntf)
+            return False
+
+        # Ensure our dirty flag is set to False
+        self.__dirty = False
+
+        return True
+
+    def files(self, exclude=True, lazy=True):
+        """
+        Returns the list of files associated with our persistent storage
+        """
+
+        if lazy and exclude in self.__cache_files:
+            # Take an early exit with our cached results
+            return self.__cache_files[exclude]
+
+        elif self.__mode == PersistentStoreMode.MEMORY:
+            # Take an early exit
+            # exclude is our cache switch and can be either True or False.
+            # For the below, we just set both cases and set them up as an
+            # empty record
+            self.__cache_files.update({True: [], False: []})
+            return []
+
+        if not lazy or self.__exclude_list is None:
+            # A list of criteria that should be excluded from the size count
+            self.__exclude_list = (
+                # Exclude backup cache file from count
+                re.compile(re.escape(os.path.join(
+                    self.__base_path,
+                    f'{self.__cache_key}{self.__backup_extension}'))),
+
+                # Exclude temporary files
+                re.compile(re.escape(self.__temp_path) + r'[/\\].+'),
+
+                # Exclude custom backup persistent files
+                re.compile(
+                    re.escape(self.__data_path) + r'[/\\].+' + re.escape(
+                        self.__backup_extension)),
+            )
+
+        try:
+            if exclude:
+                self.__cache_files[exclude] = \
+                    [path for path in filter(os.path.isfile, glob.glob(
+                        os.path.join(self.__base_path, '**', '*'),
+                        recursive=True))
+                        if next((False for p in self.__exclude_list
+                                 if p.match(path)), True)]
+
+            else:  # No exclusion list applied
+                self.__cache_files[exclude] = \
+                    [path for path in filter(os.path.isfile, glob.glob(
+                        os.path.join(self.__base_path, '**', '*'),
+                        recursive=True))]
+
+        except (OSError, IOError):
+            # We can't access the directory or it does not exist
+            self.__cache_files[exclude] = []
+
+        return self.__cache_files[exclude]
+
+    @staticmethod
+    def disk_scan(path, namespace=None, closest=True):
+        """
+        Scans the path provided and returns the namespaces detected
+        """
+
+        logger.trace('Persistent path scan of: %s', path)
+
+        def is_namespace(x):
+            """
+            Validate what was detected is a valid namespace
+            """
+            return os.path.isdir(os.path.join(path, x)) \
+                and PersistentStore.__valid_key.match(x)
+
+        # Handle our namespace searching
+        if namespace:
+            if isinstance(namespace, str):
+                namespace = [namespace]
+
+            elif not isinstance(namespace, (tuple, set, list)):
+                raise AttributeError(
+                    "namespace must be None, a string, or a tuple/set/list "
+                    "of strings")
+
+        try:
+            # Acquire all of the files in question
+            namespaces = \
+                [ns for ns in filter(is_namespace, os.listdir(path))
+                 if not namespace or next(
+                     (True for n in namespace if ns.startswith(n)), False)] \
+                if closest else  \
+                [ns for ns in filter(is_namespace, os.listdir(path))
+                 if not namespace or ns in namespace]
+
+        except FileNotFoundError:
+            # no worries; Nothing to do
+            logger.debug('Disk Prune path not found; nothing to clean.')
+            return []
+
+        except (OSError, IOError) as e:
+            # Permission error of some kind or disk problem...
+            # There is nothing we can do at this point
+            logger.error(
+                'Disk Scan detected inaccessible path: %s', path)
+            logger.debug(
+                'Persistent Storage Exception: %s' % str(e))
+            return []
+
+        return namespaces
+
+    @staticmethod
+    def disk_prune(path, namespace=None, expires=None, action=False):
+        """
+        Prune persistent disk storage entries that are old and/or unreferenced
+
+        you must specify a path to perform the prune within
+
+        if one or more namespaces are provided, then pruning focuses ONLY on
+        those entries (if matched).
+
+        if action is set to False, no changes are made; the records of what
+        would have been removed are returned only (a dry run)
+
+        """
+
+        # Prepare our File Expiry
+        expires = datetime.now() - timedelta(seconds=expires) \
+            if isinstance(expires, (float, int)) and expires >= 0 \
+            else datetime.now() - timedelta(
+                seconds=PersistentStore.default_file_expiry)
+
+        # Get our namespaces
+        namespaces = PersistentStore.disk_scan(path, namespace)
+
+        # Track matches
+        _map = {}
+
+        for namespace in namespaces:
+            # Prepare our map
+            _map[namespace] = []
+
+            # Reference Directories
+            base_dir = os.path.join(path, namespace)
+            data_dir = os.path.join(base_dir, PersistentStore.data_dir)
+            temp_dir = os.path.join(base_dir, PersistentStore.temp_dir)
+
+            # Careful to only focus on files created by this Persistent Store
+            # object
+            files = [
+                os.path.join(base_dir, f'{PersistentStore.__cache_key}'
+                             f'{PersistentStore.__extension}'),
+                os.path.join(base_dir, f'{PersistentStore.__cache_key}'
+                             f'{PersistentStore.__backup_extension}'),
+            ]
+
+            # Update our files (applying what was defined above too)
+            valid_data_re = re.compile(
+                r'.*(' + re.escape(PersistentStore.__extension) +
+                r'|' + re.escape(PersistentStore.__backup_extension) + r')$')
+
+            files = [path for path in filter(
+                os.path.isfile, chain(glob.glob(
+                    os.path.join(data_dir, '*'), recursive=False), files))
+                if valid_data_re.match(path)]
+
+            # Now all temporary files
+            files.extend([path for path in filter(
+                os.path.isfile, glob.glob(
+                    os.path.join(temp_dir, '*'), recursive=False))])
+
+            # Track if we should do a directory sweep later on
+            dir_sweep = True
+
+            # Scan our files
+            for file in files:
+                try:
+                    mtime = datetime.fromtimestamp(os.path.getmtime(file))
+
+                except FileNotFoundError:
+                    # no worries; we were removing it anyway
+                    continue
+
+                except (OSError, IOError) as e:
+                    # Permission error of some kind or disk problem...
+                    # There is nothing we can do at this point
+                    logger.error(
+                        'Disk Prune (ns=%s, clean=%s) detected inaccessible '
+                        'file: %s', namespace, 'yes' if action else 'no', file)
+                    logger.debug(
+                        'Persistent Storage Exception: %s' % str(e))
+
+                    # No longer worth doing a directory sweep
+                    dir_sweep = False
+                    continue
+
+                if expires < mtime:
+                    continue
+
+                #
+                # Handle Removing
+                #
+                record = {
+                    'path': file,
+                    'removed': False,
+                }
+
+                if action:
+                    try:
+                        os.unlink(file)
+                        # Update our record
+                        record['removed'] = True
+                        logger.info(
+                            'Disk Prune (ns=%s, clean=%s) removed persistent '
+                            'file: %s', namespace,
+                            'yes' if action else 'no', file)
+
+                    except FileNotFoundError:
+                        # no longer worth doing a directory sweep
+                        dir_sweep = False
+
+                        # otherwise, no worries; we were removing the file
+                        # anyway
+
+                    except (OSError, IOError) as e:
+                        # Permission error of some kind or disk problem...
+                        # There is nothing we can do at this point
+                        logger.error(
+                            'Disk Prune (ns=%s, clean=%s) failed to remove '
+                            'persistent file: %s', namespace,
+                            'yes' if action else 'no', file)
+
+                        logger.debug(
+                            'Persistent Storage Exception: %s' % str(e))
+
+                        # No longer worth doing a directory sweep
+                        dir_sweep = False
+
+                # Store our record
+                _map[namespace].append(record)
+
+            # Memory tidy
+            del files
+
+            if dir_sweep:
+                # Gracefully clean up our namespace directory. It's okay if
+                # we fail; this just means there are still files in the
+                # directory.
+                for dirpath in (temp_dir, data_dir, base_dir):
+                    if action:
+                        try:
+                            os.rmdir(dirpath)
+                            logger.info(
+                                'Disk Prune (ns=%s, clean=%s) removed '
+                                'persistent dir: %s', namespace,
+                                'yes' if action else 'no', dirpath)
+                        except OSError:
+                            # do nothing;
+                            pass
+        return _map
+
+    def size(self, exclude=True, lazy=True):
+        """
+        Returns the total size of the persistent storage in bytes
+        """
+
+        if lazy and self.__cache_size is not None:
+            # Take an early exit
+            return self.__cache_size
+
+        elif self.__mode == PersistentStoreMode.MEMORY:
+            # Take an early exit
+            self.__cache_size = 0
+            return self.__cache_size
+
+        # Get a list of files (file paths) in the given directory
+        try:
+            self.__cache_size = sum(
+                [os.stat(path).st_size for path in
+                    self.files(exclude=exclude, lazy=lazy)])
+
+        except (OSError, IOError):
+            # We can't access the directory or it does not exist
+            self.__cache_size = 0
+
+        return self.__cache_size
+
+    def __del__(self):
+        """
+        Destructor; flushes pending content when operating in AUTO mode
+        """
+
+        if self.__mode == PersistentStoreMode.AUTO:
+            # Flush changes to disk
+            self.flush()
+
+    def __delitem__(self, key):
+        """
+        Remove a cache entry by its key
+        """
+        if self._cache is None and not self.__load_cache():
+            raise KeyError("Could not initialize cache")
+
+        try:
+            if self._cache[key].persistent:
+                # Set our dirty flag in advance
+                self.__dirty = True
+
+            # Store our new cache
+            del self._cache[key]
+
+        except KeyError:
+            # Nothing to do
+            raise
+
+        if self.__dirty and self.__mode == PersistentStoreMode.FLUSH:
+            # Flush changes to disk
+            self.flush()
+
+        return
+
+    def __contains__(self, key):
+        """
+        Verify if our storage contains the key specified or not.
+        In addition to this, if the content is expired, it is considered
+        to be not contained in the storage.
+        """
+        if self._cache is None and not self.__load_cache():
+            return False
+
+        return key in self._cache and self._cache[key]
+
+    def __setitem__(self, key, value):
+        """
+        Sets a cache value without disrupting existing settings in place
+        """
+
+        if self._cache is None and not self.__load_cache():
+            raise KeyError("Could not initialize cache")
+
+        if key not in self._cache and not self.set(key, value):
+            raise KeyError("Could not set cache")
+
+        else:
+            # Update our value
+            self._cache[key].set(value)
+
+            if self._cache[key].persistent:
+                # Set our dirty flag in advance
+                self.__dirty = True
+
+        if self.__dirty and self.__mode == PersistentStoreMode.FLUSH:
+            # Flush changes to disk
+            self.flush()
+
+        return
+
+    def __getitem__(self, key):
+        """
+        Returns the indexed value
+        """
+
+        if self._cache is None and not self.__load_cache():
+            raise KeyError("Could not initialize cache")
+
+        result = self.get(key, default=self.__not_found_ref, lazy=False)
+        if result is self.__not_found_ref:
+            raise KeyError(f" {key} not found in cache")
+
+        return result
+
+    def keys(self):
+        """
+        Returns our keys
+        """
+        if self._cache is None and not self.__load_cache():
+            # There are no keys to return
+            return {}.keys()
+
+        return self._cache.keys()
+
+    def delete(self, *args, all=None, temp=None, cache=None, validate=True):
+        """
+        Manages our file space and tidies it up
+
+        delete('key', 'key2')
+        delete(all=True)
+        delete(temp=True, cache=True)
+        """
+
+        # Our failure flag
+        has_error = False
+
+        valid_key_re = re.compile(
+            r'^(?P<key>.+)(' +
+            re.escape(self.__backup_extension) +
+            r'|' + re.escape(self.__extension) + r')$', re.I)
+
+        # Default assignments
+        if all is None:
+            all = not (args or temp or cache)
+        if temp is None:
+            temp = bool(all)
+        if cache is None:
+            cache = bool(all)
+
+        if cache and self._cache:
+            # Reset our object
+            self._cache.clear()
+            # Reset dirty flag
+            self.__dirty = False
+
+        for path in self.files(exclude=False):
+
+            # Some information we use to validate the actions of our clean()
+            # call. This is so we don't remove anything we shouldn't
+            base = os.path.dirname(path)
+            fname = os.path.basename(path)
+
+            # Clean printable path details
+            ppath = os.path.join(os.path.dirname(base), fname)
+
+            if base == self.__base_path and cache:
+                # We're handling a cache file (hopefully)
+                result = valid_key_re.match(fname)
+                key = None if not result else (
+                    result['key'] if self.__valid_key.match(result['key'])
+                    else None)
+
+                if validate and key != self.__cache_key:
+                    # We're not dealing with a cache key
+                    logger.debug(
+                        'Persistent File cleanup ignoring file: %s', path)
+                    continue
+
+                #
+                # We should proceed with removing the file if we get here
+                #
+
+            elif base == self.__data_path and (args or all):
+                # We're handling a file found in our custom data path
+                result = valid_key_re.match(fname)
+                key = None if not result else (
+                    result['key'] if self.__valid_key.match(result['key'])
+                    else None)
+
+                if validate and key is None:
+                    # we're set to validate and a non-valid file was found
+                    logger.debug(
+                        'Persistent File cleanup ignoring file: %s', path)
+                    continue
+
+                elif not all and (key is None or key not in args):
+                    # no match found
+                    logger.debug(
+                        'Persistent File cleanup ignoring file: %s', path)
+                    continue
+
+                #
+                # We should proceed with removing the file if we get here
+                #
+
+            elif base == self.__temp_path and temp:
+                #
+                # This directory is a temporary path and nothing in it needs
+                # further verification. Proceed with removing the file
+                #
+                pass
+
+            else:
+                # No match; move on
+                logger.debug('Persistent File cleanup ignoring file: %s', path)
+                continue
+
+            try:
+                os.unlink(path)
+                logger.info('Removed persistent file: %s', ppath)
+
+            except FileNotFoundError:
+                # no worries; we were removing it anyway
+                pass
+
+            except (OSError, IOError) as e:
+                # Permission error of some kind or disk problem...
+                # There is nothing we can do at this point
+                has_error = True
+                logger.error(
+                    'Failed to remove persistent file: %s', ppath)
+                logger.debug('Persistent Storage Exception: %s' % str(e))
+
+        # Reset our reference variables
+        self.__cache_size = None
+        self.__cache_files.clear()
+
+        return not has_error
+
+    @property
+    def cache_file(self):
+        """
+        Returns the full path to our cache file
+        """
+        return os.path.join(
+            self.__base_path,
+            f'{self.__cache_key}{self.__extension}',
+        )
+
+    @property
+    def path(self):
+        """
+        Returns the full path to the namespace directory
+        """
+        return self.__base_path
+
+    @property
+    def mode(self):
+        """
+        Returns our persistent storage mode
+        """
+        return self.__mode
diff --git a/apprise/plugins/africas_talking.py b/apprise/plugins/africas_talking.py
index 6d67e510..af8a7857 100644
--- a/apprise/plugins/africas_talking.py
+++ b/apprise/plugins/africas_talking.py
@@ -354,6 +354,15 @@ class NotifyAfricasTalking(NotifyBase):
 
         return not has_error
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (self.secure_protocol, self.appuser, self.apikey)
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/aprs.py b/apprise/plugins/aprs.py
index b8adef5a..d87025fe 100644
--- a/apprise/plugins/aprs.py
+++ b/apprise/plugins/aprs.py
@@ -729,6 +729,15 @@ class NotifyAprs(NotifyBase):
             params=NotifyAprs.urlencode(params),
         )
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (self.user, self.password, self.locale)
+
     def __len__(self):
         """
         Returns the number of targets associated with this notification
diff --git a/apprise/plugins/bark.py b/apprise/plugins/bark.py
index e2f5bbfb..e676e0c3 100644
--- a/apprise/plugins/bark.py
+++ b/apprise/plugins/bark.py
@@ -395,6 +395,18 @@ class NotifyBark(NotifyBase):
 
         return not has_error
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (
+            self.secure_protocol if self.secure else self.protocol,
+            self.user, self.password, self.host, self.port,
+        )
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/base.py b/apprise/plugins/base.py
index d18f0af0..8e142be9 100644
--- a/apprise/plugins/base.py
+++ b/apprise/plugins/base.py
@@ -38,7 +38,9 @@ from ..common import NotifyFormat
 from ..common import NOTIFY_FORMATS
 from ..common import OverflowMode
 from ..common import OVERFLOW_MODES
+from ..common import PersistentStoreMode
 from ..locale import gettext_lazy as _
+from ..persistent_store import PersistentStore
 from ..apprise_attachment import AppriseAttachment
 
 
@@ -130,12 +132,19 @@ class NotifyBase(URLBase):
     # of lines. Setting this to zero disables this feature.
     body_max_line_count = 0
 
+    # Persistent storage default setting
+    persistent_storage = True
+
     # Default Notify Format
     notify_format = NotifyFormat.TEXT
 
     # Default Overflow Mode
     overflow_mode = OverflowMode.UPSTREAM
 
+    # Our default is to not use persistent storage beyond an in-memory
+    # reference
+    storage_mode = PersistentStoreMode.MEMORY
+
     # Default Emoji Interpretation
     interpret_emojis = False
 
@@ -197,6 +206,16 @@ class NotifyBase(URLBase):
             # runtime.
             '_lookup_default': 'interpret_emojis',
         },
+        'store': {
+            'name': _('Persistent Storage'),
+            # Use Persistent Storage
+            'type': 'bool',
+            # Provide a default
+            'default': persistent_storage,
+            # look up default using the following parent class value at
+            # runtime.
+            '_lookup_default': 'persistent_storage',
+        },
     })
 
     #
@@ -268,6 +287,9 @@ class NotifyBase(URLBase):
         # are turned off (no user over-rides allowed)
         #
 
+        # Our Persistent Storage object is initialized on demand
+        self.__store = None
+
         # Take a default
         self.interpret_emojis = self.asset.interpret_emojis
         if 'emojis' in kwargs:
@@ -301,6 +323,14 @@ class NotifyBase(URLBase):
             # Provide override
             self.overflow_mode = overflow
 
+        # Prepare our Persistent Storage switch
+        self.persistent_storage = parse_bool(
+            kwargs.get('store', NotifyBase.persistent_storage))
+        if not self.persistent_storage:
+            # Enforce the disabling of cache (otherwise defaults are used)
+            self.url_identifier = False
+            self.__cached_url_identifier = None
+
     def image_url(self, notify_type, logo=False, extension=None,
                   image_size=None):
         """
@@ -726,6 +756,10 @@ class NotifyBase(URLBase):
             'overflow': self.overflow_mode,
         }
 
+        # Persistent Storage Setting
+        if self.persistent_storage != NotifyBase.persistent_storage:
+            params['store'] = 'yes' if self.persistent_storage else 'no'
+
         params.update(super().url_parameters(*args, **kwargs))
 
         # return default parameters
@@ -778,6 +812,10 @@ class NotifyBase(URLBase):
         # Allow emoji's override
         if 'emojis' in results['qsd']:
             results['emojis'] = parse_bool(results['qsd'].get('emojis'))
+
+        # Store our persistent storage boolean
+        if 'store' in results['qsd']:
+            results['store'] = results['qsd']['store']
 
         return results
 
@@ -798,3 +836,29 @@ class NotifyBase(URLBase):
         should return the same set of results that parse_url() does.
         """
         return None
+
+    @property
+    def store(self):
+        """
+        Returns a pointer to our persistent store for use.
+
+          The best use cases are:
+           self.store.get('key')
+           self.store.set('key', 'value')
+           self.store.delete('key1', 'key2', ...)
+
+          You can also access the keys this way:
+           self.store['key']
+
+          And clear them:
+           del self.store['key']
+
+        """
+        if self.__store is None:
+            # Initialize our persistent store for use
+            self.__store = PersistentStore(
+                namespace=self.url_id(),
+                path=self.asset.storage_path,
+                mode=self.asset.storage_mode)
+
+        return self.__store
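+
+    # A hedged sketch only: a plugin's send() might use this to cache a
+    # short-lived access token between notifications (the key name and the
+    # helper below are hypothetical):
+    #
+    #   token = self.store.get('access_token')
+    #   if not token:
+    #       token = self._acquire_token()
+    #       self.store.set('access_token', token, expires=3600)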
diff --git a/apprise/plugins/boxcar.py b/apprise/plugins/boxcar.py
index 851cdd3d..f7f16b04 100644
--- a/apprise/plugins/boxcar.py
+++ b/apprise/plugins/boxcar.py
@@ -341,6 +341,15 @@ class NotifyBoxcar(NotifyBase):
             params=NotifyBoxcar.urlencode(params),
         )
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another simliar one. Targets or end points should never be identified
+        here.
+        """
+        return (self.secure_protocol, self.access, self.secret)
+
     def __len__(self):
         """
         Returns the number of targets associated with this notification
diff --git a/apprise/plugins/bulksms.py b/apprise/plugins/bulksms.py
index 9bbbefe5..f53f6126 100644
--- a/apprise/plugins/bulksms.py
+++ b/apprise/plugins/bulksms.py
@@ -413,6 +413,19 @@ class NotifyBulkSMS(NotifyBase):
                  for x in self.groups])),
             params=NotifyBulkSMS.urlencode(params))
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another simliar one. Targets or end points should never be identified
+        here.
+        """
+        return (
+            self.secure_protocol,
+            self.user if self.user else None,
+            self.password if self.password else None,
+        )
+
     def __len__(self):
         """
         Returns the number of targets associated with this notification
diff --git a/apprise/plugins/bulkvs.py b/apprise/plugins/bulkvs.py
index 53a36300..a02d8ab8 100644
--- a/apprise/plugins/bulkvs.py
+++ b/apprise/plugins/bulkvs.py
@@ -304,6 +304,15 @@ class NotifyBulkVS(NotifyBase):
 
         return not has_error
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another simliar one. Targets or end points should never be identified
+        here.
+        """
+        return (self.secure_protocol, self.source, self.user, self.password)
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/burstsms.py b/apprise/plugins/burstsms.py
index eb19df8e..3b6f2669 100644
--- a/apprise/plugins/burstsms.py
+++ b/apprise/plugins/burstsms.py
@@ -378,6 +378,15 @@ class NotifyBurstSMS(NotifyBase):
                 [NotifyBurstSMS.quote(x, safe='') for x in self.targets]),
             params=NotifyBurstSMS.urlencode(params))
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another simliar one. Targets or end points should never be identified
+        here.
+        """
+        return (self.secure_protocol, self.apikey, self.secret, self.source)
+
     def __len__(self):
         """
         Returns the number of targets associated with this notification
diff --git a/apprise/plugins/chantify.py b/apprise/plugins/chantify.py
index d549a59f..e7c5f63e 100644
--- a/apprise/plugins/chantify.py
+++ b/apprise/plugins/chantify.py
@@ -181,6 +181,15 @@ class NotifyChantify(NotifyBase):
             params=NotifyChantify.urlencode(params),
         )
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (self.secure_protocol, self.token)
+
     @staticmethod
     def parse_url(url):
         """
diff --git a/apprise/plugins/clicksend.py b/apprise/plugins/clicksend.py
index 9ade1055..7f28ac91 100644
--- a/apprise/plugins/clicksend.py
+++ b/apprise/plugins/clicksend.py
@@ -285,6 +285,15 @@ class NotifyClickSend(NotifyBase):
             params=NotifyClickSend.urlencode(params),
         )
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (self.secure_protocol, self.user, self.password)
+
     def __len__(self):
         """
         Returns the number of targets associated with this notification
diff --git a/apprise/plugins/custom_form.py b/apprise/plugins/custom_form.py
index 0f36643f..05fe51d1 100644
--- a/apprise/plugins/custom_form.py
+++ b/apprise/plugins/custom_form.py
@@ -272,62 +272,6 @@ class NotifyForm(NotifyBase):
 
         return
 
-    def url(self, privacy=False, *args, **kwargs):
-        """
-        Returns the URL built dynamically based on specified arguments.
-        """
-
-        # Define any URL parameters
-        params = {
-            'method': self.method,
-        }
-
-        # Extend our parameters
-        params.update(self.url_parameters(privacy=privacy, *args, **kwargs))
-
-        # Append our headers into our parameters
-        params.update({'+{}'.format(k): v for k, v in self.headers.items()})
-
-        # Append our GET params into our parameters
-        params.update({'-{}'.format(k): v for k, v in self.params.items()})
-
-        # Append our payload extra's into our parameters
-        params.update(
-            {':{}'.format(k): v for k, v in self.payload_extras.items()})
-        params.update(
-            {':{}'.format(k): v for k, v in self.payload_overrides.items()})
-
-        if self.attach_as != self.attach_as_default:
-            # Provide Attach-As extension details
-            params['attach-as'] = self.attach_as
-
-        # Determine Authentication
-        auth = ''
-        if self.user and self.password:
-            auth = '{user}:{password}@'.format(
-                user=NotifyForm.quote(self.user, safe=''),
-                password=self.pprint(
-                    self.password, privacy, mode=PrivacyMode.Secret, safe=''),
-            )
-        elif self.user:
-            auth = '{user}@'.format(
-                user=NotifyForm.quote(self.user, safe=''),
-            )
-
-        default_port = 443 if self.secure else 80
-
-        return '{schema}://{auth}{hostname}{port}{fullpath}?{params}'.format(
-            schema=self.secure_protocol if self.secure else self.protocol,
-            auth=auth,
-            # never encode hostname since we're expecting it to be a valid one
-            hostname=self.host,
-            port='' if self.port is None or self.port == default_port
-                 else ':{}'.format(self.port),
-            fullpath=NotifyForm.quote(self.fullpath, safe='/')
-            if self.fullpath else '/',
-            params=NotifyForm.urlencode(params),
-        )
-
     def send(self, body, title='', notify_type=NotifyType.INFO, attach=None,
              **kwargs):
         """
@@ -486,6 +430,76 @@ class NotifyForm(NotifyBase):
 
         return True
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (
+            self.secure_protocol if self.secure else self.protocol,
+            self.user, self.password, self.host,
+            self.port if self.port else (443 if self.secure else 80),
+            self.fullpath.rstrip('/'),
+        )
+
+    def url(self, privacy=False, *args, **kwargs):
+        """
+        Returns the URL built dynamically based on specified arguments.
+        """
+
+        # Define any URL parameters
+        params = {
+            'method': self.method,
+        }
+
+        # Extend our parameters
+        params.update(self.url_parameters(privacy=privacy, *args, **kwargs))
+
+        # Append our headers into our parameters
+        params.update({'+{}'.format(k): v for k, v in self.headers.items()})
+
+        # Append our GET params into our parameters
+        params.update({'-{}'.format(k): v for k, v in self.params.items()})
+
+        # Append our payload extras into our parameters
+        params.update(
+            {':{}'.format(k): v for k, v in self.payload_extras.items()})
+        params.update(
+            {':{}'.format(k): v for k, v in self.payload_overrides.items()})
+
+        if self.attach_as != self.attach_as_default:
+            # Provide Attach-As extension details
+            params['attach-as'] = self.attach_as
+
+        # Determine Authentication
+        auth = ''
+        if self.user and self.password:
+            auth = '{user}:{password}@'.format(
+                user=NotifyForm.quote(self.user, safe=''),
+                password=self.pprint(
+                    self.password, privacy, mode=PrivacyMode.Secret, safe=''),
+            )
+        elif self.user:
+            auth = '{user}@'.format(
+                user=NotifyForm.quote(self.user, safe=''),
+            )
+
+        default_port = 443 if self.secure else 80
+
+        return '{schema}://{auth}{hostname}{port}{fullpath}?{params}'.format(
+            schema=self.secure_protocol if self.secure else self.protocol,
+            auth=auth,
+            # never encode hostname since we're expecting it to be a valid one
+            hostname=self.host,
+            port='' if self.port is None or self.port == default_port
+                 else ':{}'.format(self.port),
+            fullpath=NotifyForm.quote(self.fullpath, safe='/')
+            if self.fullpath else '/',
+            params=NotifyForm.urlencode(params),
+        )
+
     @staticmethod
     def parse_url(url):
         """
diff --git a/apprise/plugins/custom_json.py b/apprise/plugins/custom_json.py
index e0d7a675..25b4467d 100644
--- a/apprise/plugins/custom_json.py
+++ b/apprise/plugins/custom_json.py
@@ -195,56 +195,6 @@ class NotifyJSON(NotifyBase):
 
         return
 
-    def url(self, privacy=False, *args, **kwargs):
-        """
-        Returns the URL built dynamically based on specified arguments.
-        """
-
-        # Define any URL parameters
-        params = {
-            'method': self.method,
-        }
-
-        # Extend our parameters
-        params.update(self.url_parameters(privacy=privacy, *args, **kwargs))
-
-        # Append our headers into our parameters
-        params.update({'+{}'.format(k): v for k, v in self.headers.items()})
-
-        # Append our GET params into our parameters
-        params.update({'-{}'.format(k): v for k, v in self.params.items()})
-
-        # Append our payload extra's into our parameters
-        params.update(
-            {':{}'.format(k): v for k, v in self.payload_extras.items()})
-
-        # Determine Authentication
-        auth = ''
-        if self.user and self.password:
-            auth = '{user}:{password}@'.format(
-                user=NotifyJSON.quote(self.user, safe=''),
-                password=self.pprint(
-                    self.password, privacy, mode=PrivacyMode.Secret, safe=''),
-            )
-        elif self.user:
-            auth = '{user}@'.format(
-                user=NotifyJSON.quote(self.user, safe=''),
-            )
-
-        default_port = 443 if self.secure else 80
-
-        return '{schema}://{auth}{hostname}{port}{fullpath}?{params}'.format(
-            schema=self.secure_protocol if self.secure else self.protocol,
-            auth=auth,
-            # never encode hostname since we're expecting it to be a valid one
-            hostname=self.host,
-            port='' if self.port is None or self.port == default_port
-                 else ':{}'.format(self.port),
-            fullpath=NotifyJSON.quote(self.fullpath, safe='/')
-            if self.fullpath else '/',
-            params=NotifyJSON.urlencode(params),
-        )
-
     def send(self, body, title='', notify_type=NotifyType.INFO, attach=None,
              **kwargs):
         """
@@ -395,6 +345,70 @@ class NotifyJSON(NotifyBase):
 
         return True
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (
+            self.secure_protocol if self.secure else self.protocol,
+            self.user, self.password, self.host,
+            self.port if self.port else (443 if self.secure else 80),
+            self.fullpath.rstrip('/'),
+        )
+
+    def url(self, privacy=False, *args, **kwargs):
+        """
+        Returns the URL built dynamically based on specified arguments.
+        """
+
+        # Define any URL parameters
+        params = {
+            'method': self.method,
+        }
+
+        # Extend our parameters
+        params.update(self.url_parameters(privacy=privacy, *args, **kwargs))
+
+        # Append our headers into our parameters
+        params.update({'+{}'.format(k): v for k, v in self.headers.items()})
+
+        # Append our GET params into our parameters
+        params.update({'-{}'.format(k): v for k, v in self.params.items()})
+
+        # Append our payload extras into our parameters
+        params.update(
+            {':{}'.format(k): v for k, v in self.payload_extras.items()})
+
+        # Determine Authentication
+        auth = ''
+        if self.user and self.password:
+            auth = '{user}:{password}@'.format(
+                user=NotifyJSON.quote(self.user, safe=''),
+                password=self.pprint(
+                    self.password, privacy, mode=PrivacyMode.Secret, safe=''),
+            )
+        elif self.user:
+            auth = '{user}@'.format(
+                user=NotifyJSON.quote(self.user, safe=''),
+            )
+
+        default_port = 443 if self.secure else 80
+
+        return '{schema}://{auth}{hostname}{port}{fullpath}?{params}'.format(
+            schema=self.secure_protocol if self.secure else self.protocol,
+            auth=auth,
+            # never encode hostname since we're expecting it to be a valid one
+            hostname=self.host,
+            port='' if self.port is None or self.port == default_port
+                 else ':{}'.format(self.port),
+            fullpath=NotifyJSON.quote(self.fullpath, safe='/')
+            if self.fullpath else '/',
+            params=NotifyJSON.urlencode(params),
+        )
+
     @staticmethod
     def parse_url(url):
         """
diff --git a/apprise/plugins/custom_xml.py b/apprise/plugins/custom_xml.py
index b7928fce..f72e9a1a 100644
--- a/apprise/plugins/custom_xml.py
+++ b/apprise/plugins/custom_xml.py
@@ -242,58 +242,6 @@ class NotifyXML(NotifyBase):
 
         return
 
-    def url(self, privacy=False, *args, **kwargs):
-        """
-        Returns the URL built dynamically based on specified arguments.
-        """
-
-        # Define any URL parameters
-        params = {
-            'method': self.method,
-        }
-
-        # Extend our parameters
-        params.update(self.url_parameters(privacy=privacy, *args, **kwargs))
-
-        # Append our headers into our parameters
-        params.update({'+{}'.format(k): v for k, v in self.headers.items()})
-
-        # Append our GET params into our parameters
-        params.update({'-{}'.format(k): v for k, v in self.params.items()})
-
-        # Append our payload extra's into our parameters
-        params.update(
-            {':{}'.format(k): v for k, v in self.payload_extras.items()})
-        params.update(
-            {':{}'.format(k): v for k, v in self.payload_overrides.items()})
-
-        # Determine Authentication
-        auth = ''
-        if self.user and self.password:
-            auth = '{user}:{password}@'.format(
-                user=NotifyXML.quote(self.user, safe=''),
-                password=self.pprint(
-                    self.password, privacy, mode=PrivacyMode.Secret, safe=''),
-            )
-        elif self.user:
-            auth = '{user}@'.format(
-                user=NotifyXML.quote(self.user, safe=''),
-            )
-
-        default_port = 443 if self.secure else 80
-
-        return '{schema}://{auth}{hostname}{port}{fullpath}?{params}'.format(
-            schema=self.secure_protocol if self.secure else self.protocol,
-            auth=auth,
-            # never encode hostname since we're expecting it to be a valid one
-            hostname=self.host,
-            port='' if self.port is None or self.port == default_port
-                 else ':{}'.format(self.port),
-            fullpath=NotifyXML.quote(self.fullpath, safe='/')
-            if self.fullpath else '/',
-            params=NotifyXML.urlencode(params),
-        )
-
     def send(self, body, title='', notify_type=NotifyType.INFO, attach=None,
              **kwargs):
         """
@@ -467,6 +415,72 @@ class NotifyXML(NotifyBase):
 
         return True
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (
+            self.secure_protocol if self.secure else self.protocol,
+            self.user, self.password, self.host,
+            self.port if self.port else (443 if self.secure else 80),
+            self.fullpath.rstrip('/'),
+        )
+
+    def url(self, privacy=False, *args, **kwargs):
+        """
+        Returns the URL built dynamically based on specified arguments.
+        """
+
+        # Define any URL parameters
+        params = {
+            'method': self.method,
+        }
+
+        # Extend our parameters
+        params.update(self.url_parameters(privacy=privacy, *args, **kwargs))
+
+        # Append our headers into our parameters
+        params.update({'+{}'.format(k): v for k, v in self.headers.items()})
+
+        # Append our GET params into our parameters
+        params.update({'-{}'.format(k): v for k, v in self.params.items()})
+
+        # Append our payload extras into our parameters
+        params.update(
+            {':{}'.format(k): v for k, v in self.payload_extras.items()})
+        params.update(
+            {':{}'.format(k): v for k, v in self.payload_overrides.items()})
+
+        # Determine Authentication
+        auth = ''
+        if self.user and self.password:
+            auth = '{user}:{password}@'.format(
+                user=NotifyXML.quote(self.user, safe=''),
+                password=self.pprint(
+                    self.password, privacy, mode=PrivacyMode.Secret, safe=''),
+            )
+        elif self.user:
+            auth = '{user}@'.format(
+                user=NotifyXML.quote(self.user, safe=''),
+            )
+
+        default_port = 443 if self.secure else 80
+
+        return '{schema}://{auth}{hostname}{port}{fullpath}?{params}'.format(
+            schema=self.secure_protocol if self.secure else self.protocol,
+            auth=auth,
+            # never encode hostname since we're expecting it to be a valid one
+            hostname=self.host,
+            port='' if self.port is None or self.port == default_port
+                 else ':{}'.format(self.port),
+            fullpath=NotifyXML.quote(self.fullpath, safe='/')
+            if self.fullpath else '/',
+            params=NotifyXML.urlencode(params),
+        )
+
     @staticmethod
     def parse_url(url):
         """
diff --git a/apprise/plugins/d7networks.py b/apprise/plugins/d7networks.py
index ad55e219..ff2e31b0 100644
--- a/apprise/plugins/d7networks.py
+++ b/apprise/plugins/d7networks.py
@@ -354,6 +354,15 @@ class NotifyD7Networks(NotifyBase):
                 [NotifyD7Networks.quote(x, safe='') for x in self.targets]),
             params=NotifyD7Networks.urlencode(params))
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (self.secure_protocol, self.token)
+
     def __len__(self):
         """
         Returns the number of targets associated with this notification
diff --git a/apprise/plugins/dapnet.py b/apprise/plugins/dapnet.py
index 60a18acd..725174c1 100644
--- a/apprise/plugins/dapnet.py
+++ b/apprise/plugins/dapnet.py
@@ -346,6 +346,15 @@ class NotifyDapnet(NotifyBase):
             params=NotifyDapnet.urlencode(params),
         )
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (self.secure_protocol, self.user, self.password)
+
     def __len__(self):
         """
         Returns the number of targets associated with this notification
diff --git a/apprise/plugins/dbus.py b/apprise/plugins/dbus.py
index f2361fd6..9a22a85f 100644
--- a/apprise/plugins/dbus.py
+++ b/apprise/plugins/dbus.py
@@ -173,7 +173,6 @@ class NotifyDBus(NotifyBase):
     # object if we were to reference, we wouldn't be backwards compatible with
     # Python v2.  So converting the result set back into a list makes us
     # compatible
-    # TODO: Review after dropping support for Python 2.
     protocol = list(MAINLOOP_MAP.keys())
 
     # A URL that takes you to the setup/help of the specific protocol
@@ -196,6 +195,10 @@ class NotifyDBus(NotifyBase):
     dbus_interface = 'org.freedesktop.Notifications'
     dbus_setting_location = '/org/freedesktop/Notifications'
 
+    # No URL Identifier will be defined for this service as there simply
+    # aren't enough details to uniquely identify one dbus:// from another.
+    url_identifier = False
+
     # Define object templates
     templates = (
         '{schema}://',
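
For services with no distinguishing detail, the opt-out is a class-level `url_identifier = False` rather than a property, as dbus:// does above (and gnome:// and macosx:// below). A consumer presumably checks the attribute before deriving any namespace; a sketch of that contract:

```python
# Sketch of the presumed contract: a falsy class attribute means the
# plugin has no stable identity and thus no persistent-storage namespace.
class LocalOnlySketch:
    url_identifier = False


plugin = LocalOnlySketch()
if not plugin.url_identifier:
    print('no persistent-storage namespace for this plugin')
```
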
diff --git a/apprise/plugins/dingtalk.py b/apprise/plugins/dingtalk.py
index 2ca1bc55..e675f530 100644
--- a/apprise/plugins/dingtalk.py
+++ b/apprise/plugins/dingtalk.py
@@ -310,6 +310,15 @@ class NotifyDingTalk(NotifyBase):
                 [NotifyDingTalk.quote(x, safe='') for x in self.targets]),
             args=NotifyDingTalk.urlencode(args))
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another simliar one. Targets or end points should never be identified
+        here.
+        """
+        return (self.secure_protocol, self.secret, self.token)
+
     def __len__(self):
         """
         Returns the number of targets associated with this notification
diff --git a/apprise/plugins/discord.py b/apprise/plugins/discord.py
index 14c6152b..e41e22cd 100644
--- a/apprise/plugins/discord.py
+++ b/apprise/plugins/discord.py
@@ -607,6 +607,15 @@ class NotifyDiscord(NotifyBase):
             params=NotifyDiscord.urlencode(params),
         )
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (self.secure_protocol, self.webhook_id, self.webhook_token)
+
     @staticmethod
     def parse_url(url):
         """
diff --git a/apprise/plugins/email.py b/apprise/plugins/email.py
index 142c93cf..f720c426 100644
--- a/apprise/plugins/email.py
+++ b/apprise/plugins/email.py
@@ -1031,6 +1031,20 @@ class NotifyEmail(NotifyBase):
             params=NotifyEmail.urlencode(params),
         )
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (
+            self.secure_protocol if self.secure else self.protocol,
+            self.user, self.password, self.host,
+            self.port if self.port
+            else SECURE_MODES[self.secure_mode]['default_port'],
+        )
+
     def __len__(self):
         """
         Returns the number of targets associated with this notification
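
`NotifyEmail` differs from the plain HTTP plugins in that its port fallback is mode-dependent rather than a flat 443/80 pair. A sketch with a stand-in `SECURE_MODES` table of the shape the diff implies:

```python
# Stand-in table; the real one lives in apprise/plugins/email.py.
SECURE_MODES = {
    'ssl': {'default_port': 465},
    'starttls': {'default_port': 587},
}

def email_port(port, secure_mode):
    # An explicit port always wins; otherwise fall back per secure mode.
    return port if port else SECURE_MODES[secure_mode]['default_port']

assert email_port(None, 'ssl') == 465
assert email_port(2525, 'starttls') == 2525
```
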
diff --git a/apprise/plugins/emby.py b/apprise/plugins/emby.py
index 5e4e0b89..5824932e 100644
--- a/apprise/plugins/emby.py
+++ b/apprise/plugins/emby.py
@@ -593,6 +593,18 @@ class NotifyEmby(NotifyBase):
 
         return not has_error
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (
+            self.secure_protocol, self.user, self.password, self.host,
+            self.port if self.port else (443 if self.secure else 80),
+        )
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/enigma2.py b/apprise/plugins/enigma2.py
index 8b1fff68..a79d3b57 100644
--- a/apprise/plugins/enigma2.py
+++ b/apprise/plugins/enigma2.py
@@ -181,6 +181,20 @@ class NotifyEnigma2(NotifyBase):
 
         return
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (
+            self.secure_protocol,
+            self.user, self.password, self.host,
+            self.port if self.port else (443 if self.secure else 80),
+            self.fullpath.rstrip('/'),
+        )
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/fcm/__init__.py b/apprise/plugins/fcm/__init__.py
index 9dc0679f..e5db817e 100644
--- a/apprise/plugins/fcm/__init__.py
+++ b/apprise/plugins/fcm/__init__.py
@@ -507,6 +507,15 @@ class NotifyFCM(NotifyBase):
 
         return not has_error
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (self.secure_protocol, self.mode, self.apikey, self.project)
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/feishu.py b/apprise/plugins/feishu.py
index 961523ba..9b3c74ea 100644
--- a/apprise/plugins/feishu.py
+++ b/apprise/plugins/feishu.py
@@ -192,6 +192,15 @@ class NotifyFeishu(NotifyBase):
 
         return True
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (self.secure_protocol, self.token)
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/flock.py b/apprise/plugins/flock.py
index bf2cd131..99ce3582 100644
--- a/apprise/plugins/flock.py
+++ b/apprise/plugins/flock.py
@@ -308,6 +308,15 @@ class NotifyFlock(NotifyBase):
 
         return not has_error
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (self.secure_protocol, self.token)
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/freemobile.py b/apprise/plugins/freemobile.py
index 4ff3d482..5208d566 100644
--- a/apprise/plugins/freemobile.py
+++ b/apprise/plugins/freemobile.py
@@ -103,6 +103,15 @@ class NotifyFreeMobile(NotifyBase):
 
         return
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (self.secure_protocol, self.user, self.password)
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/gnome.py b/apprise/plugins/gnome.py
index b64b5130..0a413373 100644
--- a/apprise/plugins/gnome.py
+++ b/apprise/plugins/gnome.py
@@ -132,6 +132,10 @@ class NotifyGnome(NotifyBase):
     # cause any title (if defined) to get placed into the message body.
     title_maxlen = 0
 
+    # No URL Identifier will be defined for this service as there simply
+    # aren't enough details to uniquely identify one gnome:// from another.
+    url_identifier = False
+
     # Define object templates
     templates = (
         '{schema}://',
diff --git a/apprise/plugins/google_chat.py b/apprise/plugins/google_chat.py
index f30cdae4..f12e2402 100644
--- a/apprise/plugins/google_chat.py
+++ b/apprise/plugins/google_chat.py
@@ -265,6 +265,18 @@ class NotifyGoogleChat(NotifyBase):
 
         return True
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (
+            self.secure_protocol, self.workspace, self.webhook_key,
+            self.webhook_token,
+        )
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/gotify.py b/apprise/plugins/gotify.py
index bf6c1b28..1be2d005 100644
--- a/apprise/plugins/gotify.py
+++ b/apprise/plugins/gotify.py
@@ -265,6 +265,20 @@ class NotifyGotify(NotifyBase):
 
         return True
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (
+            self.secure_protocol if self.secure else self.protocol,
+            self.user, self.password, self.host,
+            self.port if self.port else (443 if self.secure else 80),
+            self.fullpath.rstrip('/'),
+        )
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/growl.py b/apprise/plugins/growl.py
index 0b367218..e6f6237e 100644
--- a/apprise/plugins/growl.py
+++ b/apprise/plugins/growl.py
@@ -338,6 +338,19 @@ class NotifyGrowl(NotifyBase):
 
         return True
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (
+            self.secure_protocol if self.secure else self.protocol,
+            self.user, self.password, self.host,
+            self.port if self.port else self.default_port,
+        )
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/home_assistant.py b/apprise/plugins/home_assistant.py
index b0ffcaa6..c59b58a9 100644
--- a/apprise/plugins/home_assistant.py
+++ b/apprise/plugins/home_assistant.py
@@ -179,8 +179,8 @@ class NotifyHomeAssistant(NotifyBase):
         if isinstance(self.port, int):
             url += ':%d' % self.port
 
-        url += '' if not self.fullpath else '/' + self.fullpath.strip('/')
-        url += '/api/services/persistent_notification/create'
+        url += self.fullpath.rstrip('/') + \
+            '/api/services/persistent_notification/create'
 
         self.logger.debug('Home Assistant POST URL: %s (cert_verify=%r)' % (
             url, self.verify_certificate,
@@ -231,6 +231,22 @@ class NotifyHomeAssistant(NotifyBase):
 
         return True
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (
+            self.secure_protocol if self.secure else self.protocol,
+            self.user, self.password, self.host,
+            self.port if self.port else (
+                443 if self.secure else self.default_insecure_port),
+            self.fullpath.rstrip('/'),
+            self.accesstoken,
+        )
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
@@ -302,7 +318,7 @@ class NotifyHomeAssistant(NotifyBase):
             results['accesstoken'] = fullpath.pop() if fullpath else None
 
             # Re-assemble our full path
-            results['fullpath'] = '/'.join(fullpath)
+            results['fullpath'] = '/' + '/'.join(fullpath) if fullpath else ''
 
         # Allow the specification of a unique notification_id so that
         # it will always replace the last one sent.
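
The two Home Assistant changes work together: `parse_url` now keeps the leading slash on `fullpath` (or leaves it empty), so the POST URL can be built by simple concatenation instead of the old conditional join. A quick check of the resulting behaviour:

```python
# Mirrors the URL assembly above; fullpath is '' or starts with '/'.
def build_url(host, fullpath):
    return 'https://{}{}'.format(host, fullpath.rstrip('/')) \
        + '/api/services/persistent_notification/create'

assert build_url('hass.local', '') == \
    'https://hass.local/api/services/persistent_notification/create'
assert build_url('hass.local', '/prefix/') == \
    'https://hass.local/prefix/api/services/persistent_notification/create'
```
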
diff --git a/apprise/plugins/httpsms.py b/apprise/plugins/httpsms.py
index b36e286d..b4da6c62 100644
--- a/apprise/plugins/httpsms.py
+++ b/apprise/plugins/httpsms.py
@@ -253,6 +253,15 @@ class NotifyHttpSMS(NotifyBase):
 
         return not has_error
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another simliar one. Targets or end points should never be identified
+        here.
+        """
+        return (self.secure_protocol, self.source, self.apikey)
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/ifttt.py b/apprise/plugins/ifttt.py
index 9d89b146..64d9cc31 100644
--- a/apprise/plugins/ifttt.py
+++ b/apprise/plugins/ifttt.py
@@ -287,6 +287,15 @@ class NotifyIFTTT(NotifyBase):
 
         return not has_error
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (self.secure_protocol, self.webhook_id)
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/join.py b/apprise/plugins/join.py
index b92bb37a..239f9682 100644
--- a/apprise/plugins/join.py
+++ b/apprise/plugins/join.py
@@ -345,6 +345,15 @@ class NotifyJoin(NotifyBase):
 
         return not has_error
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (self.secure_protocol, self.apikey)
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/kavenegar.py b/apprise/plugins/kavenegar.py
index e4963f40..ea3bd8ed 100644
--- a/apprise/plugins/kavenegar.py
+++ b/apprise/plugins/kavenegar.py
@@ -304,6 +304,15 @@ class NotifyKavenegar(NotifyBase):
 
         return not has_error
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (self.secure_protocol, self.source, self.apikey)
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/kumulos.py b/apprise/plugins/kumulos.py
index 504dcc37..941f163d 100644
--- a/apprise/plugins/kumulos.py
+++ b/apprise/plugins/kumulos.py
@@ -198,6 +198,15 @@ class NotifyKumulos(NotifyBase):
             return False
         return True
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (self.secure_protocol, self.apikey, self.serverkey)
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/lametric.py b/apprise/plugins/lametric.py
index 7fdd8ebc..b6124a91 100644
--- a/apprise/plugins/lametric.py
+++ b/apprise/plugins/lametric.py
@@ -783,6 +783,29 @@ class NotifyLametric(NotifyBase):
 
         return True
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        if self.mode == LametricMode.DEVICE:
+            return (
+                self.secure_protocol if self.secure else self.protocol,
+                self.user, self.lametric_apikey, self.host,
+                self.port if self.port else (
+                    443 if self.secure else
+                    self.template_tokens['port']['default']),
+            )
+
+        return (
+            self.protocol,
+            self.lametric_app_access_token,
+            self.lametric_app_id,
+            self.lametric_app_ver,
+        )
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
@@ -871,6 +894,9 @@ class NotifyLametric(NotifyBase):
             results['password'] = results['user']
             results['user'] = None
 
+        # Get unquoted entries
+        entries = NotifyLametric.split_path(results['fullpath'])
+
         # Priority Handling
         if 'priority' in results['qsd'] and results['qsd']['priority']:
             results['priority'] = NotifyLametric.unquote(
@@ -913,6 +939,10 @@ class NotifyLametric(NotifyBase):
             results['app_ver'] = \
                 NotifyLametric.unquote(results['qsd']['app_ver'])
 
+        elif entries:
+            # Store our app version
+            results['app_ver'] = entries.pop(0)
+
         if 'token' in results['qsd'] and results['qsd']['token']:
             # Extract Application Access Token from an argument
             results['app_token'] = \
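
The LaMetric parsing change lets an application version ride in the URL path: an explicit `?app_ver=` argument still wins, and only otherwise is the first unconsumed path entry used. The precedence, sketched with a plain split standing in for `split_path()`:

```python
# Precedence sketch; split_path() is assumed to behave like this filter.
def parse_app_ver(qsd, fullpath):
    entries = [e for e in fullpath.split('/') if e]
    if 'app_ver' in qsd and qsd['app_ver']:
        return qsd['app_ver']
    return entries.pop(0) if entries else None

assert parse_app_ver({'app_ver': '2'}, '/1') == '2'   # query arg wins
assert parse_app_ver({}, '/1') == '1'                 # else: path entry
assert parse_app_ver({}, '') is None
```
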
diff --git a/apprise/plugins/line.py b/apprise/plugins/line.py
index 07a01e76..c177e26f 100644
--- a/apprise/plugins/line.py
+++ b/apprise/plugins/line.py
@@ -241,6 +241,15 @@ class NotifyLine(NotifyBase):
 
         return not has_error
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (self.secure_protocol, self.token)
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/lunasea.py b/apprise/plugins/lunasea.py
index 2af51917..97291fc9 100644
--- a/apprise/plugins/lunasea.py
+++ b/apprise/plugins/lunasea.py
@@ -324,6 +324,24 @@ class NotifyLunaSea(NotifyBase):
 
         return not has_error
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        secure = self.secure_protocol[0] \
+            if self.mode == LunaSeaMode.CLOUD else (
+                self.secure_protocol[0] if self.secure else self.protocol[0])
+        return (
+            secure,
+            self.host if self.mode == LunaSeaMode.PRIVATE else None,
+            self.port if self.port else (443 if self.secure else 80),
+            self.user if self.user else None,
+            self.password if self.password else None,
+        )
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/macosx.py b/apprise/plugins/macosx.py
index 31b7101b..153a88a2 100644
--- a/apprise/plugins/macosx.py
+++ b/apprise/plugins/macosx.py
@@ -92,6 +92,10 @@ class NotifyMacOSX(NotifyBase):
     # content to display
     body_max_line_count = 10
 
+    # No URL Identifier will be defined for this service as there simply
+    # aren't enough details to uniquely identify one macosx:// from another.
+    url_identifier = False
+
     # The possible paths to the terminal-notifier
     notify_paths = (
         '/opt/homebrew/bin/terminal-notifier',
diff --git a/apprise/plugins/mailgun.py b/apprise/plugins/mailgun.py
index 69ab72dd..4b73957a 100644
--- a/apprise/plugins/mailgun.py
+++ b/apprise/plugins/mailgun.py
@@ -579,6 +579,17 @@ class NotifyMailgun(NotifyBase):
 
         return not has_error
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (
+            self.secure_protocol, self.host, self.apikey, self.region_name,
+        )
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/mastodon.py b/apprise/plugins/mastodon.py
index b6e451ad..85379ba0 100644
--- a/apprise/plugins/mastodon.py
+++ b/apprise/plugins/mastodon.py
@@ -336,6 +336,18 @@ class NotifyMastodon(NotifyBase):
 
         return
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (
+            self.secure_protocol[0], self.token, self.host,
+            self.port if self.port else (443 if self.secure else 80),
+        )
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/matrix.py b/apprise/plugins/matrix.py
index 70a40987..bb9c6dbb 100644
--- a/apprise/plugins/matrix.py
+++ b/apprise/plugins/matrix.py
@@ -42,6 +42,7 @@ from ..url import PrivacyMode
 from ..common import NotifyType
 from ..common import NotifyImageSize
 from ..common import NotifyFormat
+from ..common import PersistentStoreMode
 from ..utils import parse_bool
 from ..utils import parse_list
 from ..utils import is_hostname
@@ -175,6 +176,13 @@ class NotifyMatrix(NotifyBase):
     # the server doesn't remind us how long we should wait for
     default_wait_ms = 1000
 
+    # Our default storage mode is AUTO; gathered data is only flushed to
+    # disk on demand
+    storage_mode = PersistentStoreMode.AUTO
+
+    # Keep our cache for 20 days
+    default_cache_expiry_sec = 60 * 60 * 24 * 20
+
     # Define object templates
     templates = (
         # Targets are ignored when using t2bot mode; only a token is required
@@ -299,10 +307,6 @@ class NotifyMatrix(NotifyBase):
         # Place an image inline with the message body
         self.include_image = include_image
 
-        # maintain a lookup of room alias's we already paired with their id
-        # to speed up future requests
-        self._room_cache = {}
-
         # Setup our mode
         self.mode = self.template_args['mode']['default'] \
             if not isinstance(mode, str) else mode.lower()
@@ -342,6 +346,7 @@ class NotifyMatrix(NotifyBase):
                   .format(self.host)
             self.logger.warning(msg)
             raise TypeError(msg)
+
         else:
             # Verify port if specified
             if self.port is not None and not (
@@ -353,6 +358,23 @@ class NotifyMatrix(NotifyBase):
                 self.logger.warning(msg)
                 raise TypeError(msg)
 
+        #
+        # Initialize from cache if present
+        #
+        if self.mode != MatrixWebhookMode.T2BOT:
+            # our home server gets populated after a login/registration
+            self.home_server = self.store.get('home_server')
+
+            # our user_id gets populated after a login/registration
+            self.user_id = self.store.get('user_id')
+
+            # This gets initialized after a login/registration
+            self.access_token = self.store.get('access_token')
+
+        # This gets incremented for each request made against the v3 API
+        self.transaction_id = 0 if not self.access_token \
+            else self.store.get('transaction_id', 0)
+
     def send(self, body, title='', notify_type=NotifyType.INFO, **kwargs):
         """
         Perform Matrix Notification
@@ -695,6 +717,9 @@ class NotifyMatrix(NotifyBase):
             # recognized as retransmissions and ignored
             if self.version == MatrixVersion.V3:
                 self.transaction_id += 1
+                self.store.set(
+                    'transaction_id', self.transaction_id,
+                    expires=self.default_cache_expiry_sec)
 
             if not postokay:
                 # Notify our user
@@ -811,7 +836,18 @@ class NotifyMatrix(NotifyBase):
         self.home_server = response.get('home_server')
         self.user_id = response.get('user_id')
 
+        self.store.set(
+            'access_token', self.access_token,
+            expires=self.default_cache_expiry_sec)
+        self.store.set(
+            'home_server', self.home_server,
+            expires=self.default_cache_expiry_sec)
+        self.store.set(
+            'user_id', self.user_id,
+            expires=self.default_cache_expiry_sec)
+
         if self.access_token is not None:
+            # Store our token into our store
             self.logger.debug(
                 'Registered successfully with Matrix server.')
             return True
@@ -870,6 +906,18 @@ class NotifyMatrix(NotifyBase):
 
         self.logger.debug(
             'Authenticated successfully with Matrix server.')
+
+        # Store our token into our store
+        self.store.set(
+            'access_token', self.access_token,
+            expires=self.default_cache_expiry_sec)
+        self.store.set(
+            'home_server', self.home_server,
+            expires=self.default_cache_expiry_sec)
+        self.store.set(
+            'user_id', self.user_id,
+            expires=self.default_cache_expiry_sec)
+
         return True
 
     def _logout(self):
@@ -907,8 +955,9 @@ class NotifyMatrix(NotifyBase):
         self.home_server = None
         self.user_id = None
 
-        # Clear our room cache
-        self._room_cache = {}
+        # clear our tokens
+        self.store.clear(
+            'access_token', 'home_server', 'user_id', 'transaction_id')
 
         self.logger.debug(
             'Unauthenticated successfully with Matrix server.')
@@ -948,9 +997,13 @@ class NotifyMatrix(NotifyBase):
             )
 
             # Check our cache for speed:
-            if room_id in self._room_cache:
+            try:
                 # We're done as we've already joined the channel
-                return self._room_cache[room_id]['id']
+                return self.store[room_id]['id']
+
+            except KeyError:
+                # No worries, we'll try to acquire the info
+                pass
 
             # Build our URL
             path = '/join/{}'.format(NotifyMatrix.quote(room_id))
@@ -959,10 +1012,10 @@ class NotifyMatrix(NotifyBase):
             postokay, _ = self._fetch(path, payload=payload)
             if postokay:
                 # Cache our entry for fast access later
-                self._room_cache[room_id] = {
+                self.store.set(room_id, {
                     'id': room_id,
                     'home_server': home_server,
-                }
+                })
 
             return room_id if postokay else None
 
@@ -984,9 +1037,13 @@ class NotifyMatrix(NotifyBase):
         room = '#{}:{}'.format(result.group('room'), home_server)
 
         # Check our cache for speed:
-        if room in self._room_cache:
+        try:
             # We're done as we've already joined the channel
-            return self._room_cache[room]['id']
+            return self.store[room]['id']
+
+        except KeyError:
+            # No worries, we'll try to acquire the info
+            pass
 
         # If we reach here, we need to join the channel
 
@@ -997,11 +1054,12 @@ class NotifyMatrix(NotifyBase):
         postokay, response = self._fetch(path, payload=payload)
         if postokay:
             # Cache our entry for fast access later
-            self._room_cache[room] = {
+            self.store.set(room, {
                 'id': response.get('room_id'),
                 'home_server': home_server,
-            }
-            return self._room_cache[room]['id']
+            })
+
+            return response.get('room_id')
 
         # Try to create the channel
         return self._room_create(room)
@@ -1056,10 +1114,10 @@ class NotifyMatrix(NotifyBase):
             return None
 
         # Cache our entry for fast access later
-        self._room_cache[response.get('room_alias')] = {
+        self.store.set(response.get('room_alias'), {
             'id': response.get('room_id'),
             'home_server': home_server,
-        }
+        })
 
         return response.get('room_id')
 
@@ -1292,6 +1350,11 @@ class NotifyMatrix(NotifyBase):
             # nothing to do
             return
 
+        if self.store.mode != PersistentStoreMode.MEMORY:
+            # We no longer have to log out as we have persistent storage to
+            # re-use our credentials with
+            return
+
         try:
             self._logout()
 
@@ -1336,6 +1399,22 @@ class NotifyMatrix(NotifyBase):
             # the end user if we don't have to.
             pass
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (
+            self.secure_protocol if self.secure else self.protocol,
+            self.host if self.mode != MatrixWebhookMode.T2BOT
+            else self.access_token,
+            self.port if self.port else (443 if self.secure else 80),
+            self.user if self.mode != MatrixWebhookMode.T2BOT else None,
+            self.password if self.mode != MatrixWebhookMode.T2BOT else None,
+        )
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
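
Taken together, the Matrix changes swap the old per-instance `_room_cache` for the shared persistent store and extend it to credentials: attributes are seeded from the store in the constructor, written back with a 20-day expiry after login/registration, and teardown skips the logout entirely when storage outlives the process. A condensed sketch of that flow, with a dict standing in for the store:

```python
CACHE_EXPIRY = 60 * 60 * 24 * 20  # 20 days, as in the diff


class Store(dict):
    # In-memory stand-in for the plugin's persistent store.
    def set(self, key, value, expires=None):
        self[key] = value  # a real store would persist and honour expires


class MatrixSessionSketch:
    def __init__(self, store):
        self.store = store
        # Seed credentials cached by a previous run before logging in
        self.access_token = store.get('access_token')

    def login(self):
        if self.access_token:
            return True  # cached token: skip the network round-trip
        self.access_token = 'fresh-token'  # stand-in for the real login
        self.store.set('access_token', self.access_token,
                       expires=CACHE_EXPIRY)
        return True


store = Store()
MatrixSessionSketch(store).login()          # first run: logs in, caches
assert MatrixSessionSketch(store).access_token == 'fresh-token'  # reused
```
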
diff --git a/apprise/plugins/mattermost.py b/apprise/plugins/mattermost.py
index 685ca947..a9a3ee62 100644
--- a/apprise/plugins/mattermost.py
+++ b/apprise/plugins/mattermost.py
@@ -223,8 +223,8 @@ class NotifyMattermost(NotifyBase):
                 payload['channel'] = channel
 
             url = '{}://{}:{}{}/hooks/{}'.format(
-                self.schema, self.host, self.port, self.fullpath,
-                self.token)
+                self.schema, self.host, self.port,
+                self.fullpath.rstrip('/'), self.token)
 
             self.logger.debug('Mattermost POST URL: %s (cert_verify=%r)' % (
                 url, self.verify_certificate,
@@ -286,6 +286,18 @@ class NotifyMattermost(NotifyBase):
         # Return our overall status
         return not has_error
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (
+            self.secure_protocol if self.secure else self.protocol,
+            self.token, self.host, self.port, self.fullpath,
+        )
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/messagebird.py b/apprise/plugins/messagebird.py
index c496d347..053c5930 100644
--- a/apprise/plugins/messagebird.py
+++ b/apprise/plugins/messagebird.py
@@ -291,6 +291,15 @@ class NotifyMessageBird(NotifyBase):
 
         return not has_error
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (self.secure_protocol, self.apikey, self.source)
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/misskey.py b/apprise/plugins/misskey.py
index 73b8f7c6..21d0250d 100644
--- a/apprise/plugins/misskey.py
+++ b/apprise/plugins/misskey.py
@@ -191,6 +191,18 @@ class NotifyMisskey(NotifyBase):
 
         return
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (
+            self.secure_protocol if self.secure else self.protocol,
+            self.token, self.host, self.port,
+        )
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/mqtt.py b/apprise/plugins/mqtt.py
index 1e09cd14..22ab9bfb 100644
--- a/apprise/plugins/mqtt.py
+++ b/apprise/plugins/mqtt.py
@@ -429,6 +429,23 @@ class NotifyMQTT(NotifyBase):
 
         return not has_error
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (
+            self.secure_protocol if self.secure else self.protocol,
+            self.user, self.password, self.host,
+            self.port if self.port else (
+                self.mqtt_secure_port if self.secure
+                else self.mqtt_insecure_port),
+            self.fullpath.rstrip('/'),
+            self.client_id,
+        )
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/msg91.py b/apprise/plugins/msg91.py
index 28a5bf18..cf5fea9b 100644
--- a/apprise/plugins/msg91.py
+++ b/apprise/plugins/msg91.py
@@ -310,6 +310,15 @@ class NotifyMSG91(NotifyBase):
 
         return True
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (self.secure_protocol, self.template, self.authkey)
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/msteams.py b/apprise/plugins/msteams.py
index 1e1925f6..83f85c79 100644
--- a/apprise/plugins/msteams.py
+++ b/apprise/plugins/msteams.py
@@ -468,6 +468,19 @@ class NotifyMSTeams(NotifyBase):
 
         return True
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (
+            self.secure_protocol,
+            self.team if self.version > 1 else None,
+            self.token_a, self.token_b, self.token_c,
+        )
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/nextcloud.py b/apprise/plugins/nextcloud.py
index 9acfc43d..7afe2d9a 100644
--- a/apprise/plugins/nextcloud.py
+++ b/apprise/plugins/nextcloud.py
@@ -278,6 +278,18 @@ class NotifyNextcloud(NotifyBase):
 
         return not has_error
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (
+            self.secure_protocol if self.secure else self.protocol,
+            self.user, self.password, self.host, self.port,
+        )
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/nextcloudtalk.py b/apprise/plugins/nextcloudtalk.py
index b1b01477..7ba953c3 100644
--- a/apprise/plugins/nextcloudtalk.py
+++ b/apprise/plugins/nextcloudtalk.py
@@ -253,6 +253,18 @@ class NotifyNextcloudTalk(NotifyBase):
 
         return not has_error
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (
+            self.secure_protocol if self.secure else self.protocol,
+            self.user, self.password, self.host, self.port,
+        )
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/notica.py b/apprise/plugins/notica.py
index 661fde1d..9c5778af 100644
--- a/apprise/plugins/notica.py
+++ b/apprise/plugins/notica.py
@@ -278,6 +278,19 @@ class NotifyNotica(NotifyBase):
 
         return True
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (
+            self.secure_protocol if self.secure else self.protocol,
+            self.mode, self.token, self.user, self.password, self.host,
+            self.port,
+        )
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/notifiarr.py b/apprise/plugins/notifiarr.py
index b455e58a..dcb940a2 100644
--- a/apprise/plugins/notifiarr.py
+++ b/apprise/plugins/notifiarr.py
@@ -199,6 +199,18 @@ class NotifyNotifiarr(NotifyBase):
 
         return
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (
+            self.secure_protocol if self.secure else self.protocol,
+            self.apikey,
+        )
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/notifico.py b/apprise/plugins/notifico.py
index 5cb0d666..db88bf6b 100644
--- a/apprise/plugins/notifico.py
+++ b/apprise/plugins/notifico.py
@@ -197,6 +197,15 @@ class NotifyNotifico(NotifyBase):
         )
         return
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (self.secure_protocol, self.project_id, self.msghook)
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/ntfy.py b/apprise/plugins/ntfy.py
index 4814c9aa..76ee2118 100644
--- a/apprise/plugins/ntfy.py
+++ b/apprise/plugins/ntfy.py
@@ -656,6 +656,34 @@ class NotifyNtfy(NotifyBase):
 
         return False, response
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+
+        kwargs = [
+            self.secure_protocol if self.mode == NtfyMode.CLOUD else (
+                self.secure_protocol if self.secure else self.protocol),
+            self.host if self.mode == NtfyMode.PRIVATE else '',
+            443 if self.mode == NtfyMode.CLOUD else (
+                self.port if self.port else (443 if self.secure else 80)),
+        ]
+
+        if self.mode == NtfyMode.PRIVATE:
+            if self.auth == NtfyAuth.BASIC:
+                kwargs.extend([
+                    self.user if self.user else None,
+                    self.password if self.password else None,
+                ])
+
+            elif self.token:  # NtfyAuth.TOKEN also
+                kwargs.append(self.token)
+
+        return kwargs
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
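
The ntfy identifier is mode-sensitive: in CLOUD mode the host is blanked and the port pinned to 443, so any spelling of a cloud URL collapses to one identity, while PRIVATE mode folds in the host, port, and whichever credential style (basic or token) is active. Assuming the base class exposes the derived key as `url_id` (as the rest of this changeset suggests), the effect can be observed as below; a sketch, not a guaranteed API:

```python
import apprise

# Same private server and credentials, different topics (targets):
obj_a = apprise.Apprise.instantiate('ntfy://user:pass@ntfy.example.com/topic-a')
obj_b = apprise.Apprise.instantiate('ntfy://user:pass@ntfy.example.com/topic-b')

# Both objects should report the same identity, since topics are
# deliberately excluded from url_identifier.
print(obj_a.url_id == obj_b.url_id)
```
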
diff --git a/apprise/plugins/office365.py b/apprise/plugins/office365.py
index b04f7a03..21a1d6fa 100644
--- a/apprise/plugins/office365.py
+++ b/apprise/plugins/office365.py
@@ -558,6 +558,18 @@ class NotifyOffice365(NotifyBase):
 
         return (True, content)
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (
+            self.secure_protocol, self.email, self.tenant, self.client_id,
+            self.secret,
+        )
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/one_signal.py b/apprise/plugins/one_signal.py
index c8340cd2..0bac77ee 100644
--- a/apprise/plugins/one_signal.py
+++ b/apprise/plugins/one_signal.py
@@ -474,6 +474,17 @@ class NotifyOneSignal(NotifyBase):
 
         return not has_error
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (
+            self.secure_protocol, self.template_id, self.app, self.apikey,
+        )
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/opsgenie.py b/apprise/plugins/opsgenie.py
index 5327ec80..228ba755 100644
--- a/apprise/plugins/opsgenie.py
+++ b/apprise/plugins/opsgenie.py
@@ -47,10 +47,12 @@
 # API Integration Docs: https://docs.opsgenie.com/docs/api-integration
 
 import requests
-from json import dumps
+from json import dumps, loads
+import hashlib
 
 from .base import NotifyBase
-from ..common import NotifyType
+from ..common import NotifyType, NOTIFY_TYPES
+from ..common import PersistentStoreMode
 from ..utils import validate_regex
 from ..utils import is_uuid
 from ..utils import parse_list
@@ -76,6 +78,47 @@ OPSGENIE_CATEGORIES = (
 )
 
 
+class OpsgenieAlertAction:
+    """
+    Defines the supported actions
+    """
+    # Use mapping (specify :key=arg to override)
+    MAP = 'map'
+
+    # Create new alert (default)
+    NEW = 'new'
+
+    # Close Alert
+    CLOSE = 'close'
+
+    # Delete Alert
+    DELETE = 'delete'
+
+    # Acknowledge Alert
+    ACKNOWLEDGE = 'acknowledge'
+
+    # Add note to alert
+    NOTE = 'note'
+
+
+OPSGENIE_ACTIONS = (
+    OpsgenieAlertAction.MAP,
+    OpsgenieAlertAction.NEW,
+    OpsgenieAlertAction.CLOSE,
+    OpsgenieAlertAction.DELETE,
+    OpsgenieAlertAction.ACKNOWLEDGE,
+    OpsgenieAlertAction.NOTE,
+)
+
+# Map all supported Apprise notification types to Opsgenie alert actions
+OPSGENIE_ALERT_MAP = {
+    NotifyType.INFO: OpsgenieAlertAction.CLOSE,
+    NotifyType.SUCCESS: OpsgenieAlertAction.CLOSE,
+    NotifyType.WARNING: OpsgenieAlertAction.NEW,
+    NotifyType.FAILURE: OpsgenieAlertAction.NEW,
+}
+
+
 # Regions
 class OpsgenieRegion:
     US = 'us'
@@ -160,6 +203,10 @@ class NotifyOpsgenie(NotifyBase):
     # The maximum length of the body
     body_maxlen = 15000
 
+    # Our default is to not use persistent storage beyond an in-memory
+    # reference
+    storage_mode = PersistentStoreMode.AUTO
+
     # If we don't have the specified min length, then we don't bother using
     # the body directive
     opsgenie_body_minlen = 130
@@ -170,10 +217,24 @@ class NotifyOpsgenie(NotifyBase):
     # The maximum allowable targets within a notification
     default_batch_size = 50
 
+    # Defines our default message mapping
+    opsgenie_message_map = {
+        # Add a note to existing alert
+        NotifyType.INFO: OpsgenieAlertAction.NOTE,
+        # Close existing alert
+        NotifyType.SUCCESS: OpsgenieAlertAction.CLOSE,
+        # Create notice
+        NotifyType.WARNING: OpsgenieAlertAction.NEW,
+        # Create notice
+        NotifyType.FAILURE: OpsgenieAlertAction.NEW,
+    }
+
     # Define object templates
     templates = (
         '{schema}://{apikey}',
+        '{schema}://{user}@{apikey}',
         '{schema}://{apikey}/{targets}',
+        '{schema}://{user}@{apikey}/{targets}',
     )
 
     # Define our template tokens
@@ -184,6 +245,10 @@ class NotifyOpsgenie(NotifyBase):
             'private': True,
             'required': True,
         },
+        'user': {
+            'name': _('Username'),
+            'type': 'string',
+        },
         'target_escalation': {
             'name': _('Target Escalation'),
             'prefix': '^',
@@ -249,6 +314,12 @@ class NotifyOpsgenie(NotifyBase):
         'to': {
             'alias_of': 'targets',
         },
+        'action': {
+            'name': _('Action'),
+            'type': 'choice:string',
+            'values': OPSGENIE_ACTIONS,
+            'default': OPSGENIE_ACTIONS[0],
+        }
     })
 
     # Map of key-value pairs to use as custom properties of the alert.
@@ -257,11 +328,15 @@ class NotifyOpsgenie(NotifyBase):
             'name': _('Details'),
             'prefix': '+',
         },
+        'mapping': {
+            'name': _('Action Mapping'),
+            'prefix': ':',
+        },
     }
 
     def __init__(self, apikey, targets, region_name=None, details=None,
                  priority=None, alias=None, entity=None, batch=False,
-                 tags=None, **kwargs):
+                 tags=None, action=None, mapping=None, **kwargs):
         """
         Initialize Opsgenie Object
         """
@@ -298,6 +373,41 @@ class NotifyOpsgenie(NotifyBase):
             self.logger.warning(msg)
             raise TypeError(msg)
 
+        if action and isinstance(action, str):
+            self.action = next(
+                (a for a in OPSGENIE_ACTIONS if a.startswith(action)), None)
+            if self.action not in OPSGENIE_ACTIONS:
+                msg = 'The Opsgenie action specified ({}) is invalid.'\
+                    .format(action)
+                self.logger.warning(msg)
+                raise TypeError(msg)
+        else:
+            self.action = self.template_args['action']['default']
+
+        # Store our mappings
+        self.mapping = self.opsgenie_message_map.copy()
+        if mapping and isinstance(mapping, dict):
+            for _k, _v in mapping.items():
+                # Get our mapping
+                k = next((t for t in NOTIFY_TYPES if t.startswith(_k)), None)
+                if not k:
+                    msg = 'The Opsgenie mapping key specified ({}) ' \
+                        'is invalid.'.format(_k)
+                    self.logger.warning(msg)
+                    raise TypeError(msg)
+
+                _v_lower = _v.lower()
+                v = next((v for v in OPSGENIE_ACTIONS[1:]
+                          if v.startswith(_v_lower)), None)
+                if not v:
+                    msg = 'The Opsgenie mapping value (assigned to {}) ' \
+                          'specified ({}) is invalid.'.format(k, _v)
+                    self.logger.warning(msg)
+                    raise TypeError(msg)
+
+                # Update our mapping
+                self.mapping[k] = v
+
         self.details = {}
         if details:
             # Store our extra details
@@ -367,115 +477,234 @@ class NotifyOpsgenie(NotifyBase):
                     if is_uuid(target) else
                     {'type': OpsgenieCategory.USER, 'username': target})
 
-    def send(self, body, title='', notify_type=NotifyType.INFO, **kwargs):
+    def _fetch(self, method, url, payload, params=None):
         """
-        Perform Opsgenie Notification
+        Performs server retrieval/update and returns JSON Response
         """
-
         headers = {
             'User-Agent': self.app_id,
             'Content-Type': 'application/json',
             'Authorization': 'GenieKey {}'.format(self.apikey),
         }
 
+        # Some Debug Logging
+        self.logger.debug(
+            'Opsgenie POST URL: {} (cert_verify={})'.format(
+                url, self.verify_certificate))
+        self.logger.debug('Opsgenie Payload: {}'.format(payload))
+
+        # Initialize our response object
+        content = {}
+
+        # Always call throttle before any remote server i/o is made
+        self.throttle()
+        try:
+            r = method(
+                url,
+                data=dumps(payload),
+                params=params,
+                headers=headers,
+                verify=self.verify_certificate,
+                timeout=self.request_timeout,
+            )
+
+            # A Response might look like:
+            # {
+            #     "result": "Request will be processed",
+            #     "took": 0.302,
+            #     "requestId": "43a29c5c-3dbf-4fa4-9c26-f4f71023e120"
+            # }
+
+            try:
+                # Update our response object
+                content = loads(r.content)
+
+            except (AttributeError, TypeError, ValueError):
+                # ValueError = r.content is Unparsable
+                # TypeError = r.content is None
+                # AttributeError = r is None
+                content = {}
+
+            if r.status_code not in (
+                    requests.codes.accepted, requests.codes.ok):
+                status_str = \
+                    NotifyBase.http_response_code_lookup(
+                        r.status_code)
+
+                self.logger.warning(
+                    'Failed to send Opsgenie notification:'
+                    '{}{}error={}.'.format(
+                        status_str,
+                        ', ' if status_str else '',
+                        r.status_code))
+
+                self.logger.debug(
+                    'Response Details:\r\n{}'.format(r.content))
+
+                return (False, content.get('requestId'))
+
+            # If we reach here; the message was sent
+            self.logger.info('Sent Opsgenie notification')
+            self.logger.debug(
+                'Response Details:\r\n{}'.format(r.content))
+
+            return (True, content.get('requestId'))
+
+        except requests.RequestException as e:
+            self.logger.warning(
+                'A Connection error occurred sending Opsgenie '
+                'notification.')
+            self.logger.debug('Socket Exception: %s' % str(e))
+
+        return (False, content.get('requestId'))
+
+    def send(self, body, title='', notify_type=NotifyType.INFO, **kwargs):
+        """
+        Perform Opsgenie Notification
+        """
+
+        # Get our Opsgenie Action
+        action = self.mapping[notify_type] \
+            if self.action == OpsgenieAlertAction.MAP else self.action
+
         # Prepare our URL as it's based on our hostname
         notify_url = OPSGENIE_API_LOOKUP[self.region_name]
 
         # Initialize our has_error flag
         has_error = False
 
-        # Use body if title not set
-        title_body = body if not title else title
+        # Default method is to post
+        method = requests.post
 
-        # Create a copy ouf our details object
-        details = self.details.copy()
-        if 'type' not in details:
-            details['type'] = notify_type
+        # For indexing in persistent store
+        key = hashlib.sha1(
+            (self.entity if self.entity else (
+                self.alias if self.alias else (
+                    title if title else self.app_id)))
+            .encode('utf-8')).hexdigest()[0:10]
 
-        # Prepare our payload
-        payload = {
-            'source': self.app_desc,
-            'message': title_body,
-            'description': body,
-            'details': details,
-            'priority': 'P{}'.format(self.priority),
-        }
+        # Get our Opsgenie Request IDs
+        request_ids = self.store.get(key, [])
+        if not isinstance(request_ids, list):
+            request_ids = []
 
-        # Use our body directive if we exceed the minimum message
-        # limitation
-        if len(payload['message']) > self.opsgenie_body_minlen:
-            payload['message'] = '{}...'.format(
-                title_body[:self.opsgenie_body_minlen - 3])
+        if action == OpsgenieAlertAction.NEW:
+            # Create a copy of our details object
+            details = self.details.copy()
+            if 'type' not in details:
+                details['type'] = notify_type
 
-        if self.__tags:
-            payload['tags'] = self.__tags
+            # Use body if title not set
+            title_body = body if not title else title
 
-        if self.entity:
-            payload['entity'] = self.entity
+            # Prepare our payload
+            payload = {
+                'source': self.app_desc,
+                'message': title_body,
+                'description': body,
+                'details': details,
+                'priority': 'P{}'.format(self.priority),
+            }
 
-        if self.alias:
-            payload['alias'] = self.alias
+            # Use our body directive if we exceed the minimum message
+            # limitation
+            if len(payload['message']) > self.opsgenie_body_minlen:
+                payload['message'] = '{}...'.format(
+                    title_body[:self.opsgenie_body_minlen - 3])
 
-        length = len(self.targets) if self.targets else 1
-        for index in range(0, length, self.batch_size):
-            if self.targets:
-                # If there were no targets identified, then we simply
-                # just iterate once without the responders set
-                payload['responders'] = \
-                    self.targets[index:index + self.batch_size]
+            if self.__tags:
+                payload['tags'] = self.__tags
 
-            # Some Debug Logging
-            self.logger.debug(
-                'Opsgenie POST URL: {} (cert_verify={})'.format(
-                    notify_url, self.verify_certificate))
-            self.logger.debug('Opsgenie Payload: {}' .format(payload))
+            if self.entity:
+                payload['entity'] = self.entity
 
-            # Always call throttle before any remote server i/o is made
-            self.throttle()
-            try:
-                r = requests.post(
-                    notify_url,
-                    data=dumps(payload),
-                    headers=headers,
-                    verify=self.verify_certificate,
-                    timeout=self.request_timeout,
-                )
+            if self.alias:
+                payload['alias'] = self.alias
 
-                if r.status_code not in (
-                        requests.codes.accepted, requests.codes.ok):
-                    status_str = \
-                        NotifyBase.http_response_code_lookup(
-                            r.status_code)
+            if self.user:
+                payload['user'] = self.user
 
-                    self.logger.warning(
-                        'Failed to send Opsgenie notification:'
-                        '{}{}error={}.'.format(
-                            status_str,
-                            ', ' if status_str else '',
-                            r.status_code))
+            # reset our request IDs - we will re-populate them
+            request_ids = []
 
-                    self.logger.debug(
-                        'Response Details:\r\n{}'.format(r.content))
+            length = len(self.targets) if self.targets else 1
+            for index in range(0, length, self.batch_size):
+                if self.targets:
+                    # Assign the current batch of responders; if no targets
+                    # were identified we iterate just once without this set
+                    payload['responders'] = \
+                        self.targets[index:index + self.batch_size]
 
-                    # Mark our failure
+                # Perform our post
+                success, request_id = self._fetch(
+                    method, notify_url, payload)
+
+                if success and request_id:
+                    # Save our response
+                    request_ids.append(request_id)
+
+                else:
                     has_error = True
-                    continue
 
-                # If we reach here; the message was sent
-                self.logger.info('Sent Opsgenie notification')
-                self.logger.debug(
-                    'Response Details:\r\n{}'.format(r.content))
+            # Store our entries for a maximum of 60 days
+            self.store.set(key, request_ids, expires=60 * 60 * 24 * 60)
 
-            except requests.RequestException as e:
-                self.logger.warning(
-                    'A Connection error occurred sending Opsgenie '
-                    'notification.')
-                self.logger.debug('Socket Exception: %s' % str(e))
-                # Mark our failure
-                has_error = True
+        elif request_ids:
+            # Prepare our payload
+            payload = {
+                'source': self.app_desc,
+                'note': body,
+            }
+
+            if self.user:
+                payload['user'] = self.user
+
+            # Prepare our Identifier type
+            params = {
+                'identifierType': 'id',
+            }
+
+            for request_id in request_ids:
+                if action == OpsgenieAlertAction.DELETE:
+                    # Update our URL
+                    url = f'{notify_url}/{request_id}'
+                    method = requests.delete
+
+                elif action == OpsgenieAlertAction.ACKNOWLEDGE:
+                    url = f'{notify_url}/{request_id}/acknowledge'
+
+                elif action == OpsgenieAlertAction.CLOSE:
+                    url = f'{notify_url}/{request_id}/close'
+
+                else:  # action == OpsgenieAlertAction.NOTE
+                    url = f'{notify_url}/{request_id}/notes'
+
+                # Perform our post
+                success, _ = self._fetch(method, url, payload, params)
+
+                if not success:
+                    has_error = True
+
+            if not has_error and action == OpsgenieAlertAction.DELETE:
+                # Remove cached entry
+                self.store.clear(key)
+
+        else:
+            self.logger.info(
+                'No Opsgenie notification sent; there was nothing to '
+                '%s', self.action)
 
         return not has_error
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (self.secure_protocol, self.region_name, self.apikey)
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
@@ -483,6 +712,7 @@ class NotifyOpsgenie(NotifyBase):
 
         # Define any URL parameters
         params = {
+            'action': self.action,
             'region': self.region_name,
             'priority':
                 OPSGENIE_PRIORITIES[self.template_args['priority']['default']]
@@ -506,6 +736,10 @@ class NotifyOpsgenie(NotifyBase):
         # Append our details into our parameters
         params.update({'+{}'.format(k): v for k, v in self.details.items()})
 
+        # Append our action mapping extras into our parameters
+        params.update(
+            {':{}'.format(k): v for k, v in self.mapping.items()})
+
         # Extend our parameters
         params.update(self.url_parameters(privacy=privacy, *args, **kwargs))
 
@@ -522,8 +756,9 @@ class NotifyOpsgenie(NotifyBase):
                 NotifyOpsgenie.template_tokens['target_team']['prefix'],
         }
 
-        return '{schema}://{apikey}/{targets}/?{params}'.format(
+        return '{schema}://{user}{apikey}/{targets}/?{params}'.format(
             schema=self.secure_protocol,
+            user='{}@'.format(self.user) if self.user else '',
             apikey=self.pprint(self.apikey, privacy, safe=''),
             targets='/'.join(
                 [NotifyOpsgenie.quote('{}{}'.format(
@@ -608,4 +843,14 @@ class NotifyOpsgenie(NotifyBase):
         if 'to' in results['qsd'] and len(results['qsd']['to']):
             results['targets'].append(results['qsd']['to'])
 
+        # Store our action (if defined)
+        if 'action' in results['qsd'] and len(results['qsd']['action']):
+            results['action'] = \
+                NotifyOpsgenie.unquote(results['qsd']['action'])
+
+        # store any custom mapping defined
+        results['mapping'] = \
+            {NotifyOpsgenie.unquote(x): NotifyOpsgenie.unquote(y)
+             for x, y in results['qsd:'].items()}
+
         return results
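
Putting the new `action` parameter and the `:`-prefixed mapping overrides together, the parse logic above accepts URLs such as the one in this usage sketch (the user and API key below are placeholders):

```python
import apprise

aobj = apprise.Apprise()

# action=map selects per-notification-type behaviour, while the
# ':'-prefixed arguments override entries in the default message map
# (here: route 'info' notifications to the 'note' action).
aobj.add('opsgenie://{}@{}/?action=map&:info=note&:success=close'.format(
    'someuser', 'b' * 32))  # placeholder user and API key

# A WARNING maps to 'new', creating an alert whose request id is kept
# in persistent storage; a later SUCCESS would close that same alert.
aobj.notify(
    title='disk space low', body='df reports 95% usage',
    notify_type=apprise.NotifyType.WARNING)
```
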
diff --git a/apprise/plugins/pagerduty.py b/apprise/plugins/pagerduty.py
index c9d55552..51bd3888 100644
--- a/apprise/plugins/pagerduty.py
+++ b/apprise/plugins/pagerduty.py
@@ -412,6 +412,18 @@ class NotifyPagerDuty(NotifyBase):
 
         return True
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (
+            self.secure_protocol, self.integration_key, self.apikey,
+            self.source,
+        )
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/pagertree.py b/apprise/plugins/pagertree.py
index 8a041a35..7d82f677 100644
--- a/apprise/plugins/pagertree.py
+++ b/apprise/plugins/pagertree.py
@@ -299,6 +299,15 @@ class NotifyPagerTree(NotifyBase):
 
         return True
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (self.secure_protocol, self.integration)
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/parseplatform.py b/apprise/plugins/parseplatform.py
index cd59d057..808c3099 100644
--- a/apprise/plugins/parseplatform.py
+++ b/apprise/plugins/parseplatform.py
@@ -257,6 +257,19 @@ class NotifyParsePlatform(NotifyBase):
 
         return True
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (
+            self.secure_protocol if self.secure else self.protocol,
+            self.application_id, self.master_key, self.host, self.port,
+            self.fullpath,
+        )
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/popcorn_notify.py b/apprise/plugins/popcorn_notify.py
index 388aa219..16d34ab0 100644
--- a/apprise/plugins/popcorn_notify.py
+++ b/apprise/plugins/popcorn_notify.py
@@ -242,6 +242,15 @@ class NotifyPopcornNotify(NotifyBase):
 
         return not has_error
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (self.secure_protocol, self.apikey)
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/prowl.py b/apprise/plugins/prowl.py
index c174615c..55c4c6be 100644
--- a/apprise/plugins/prowl.py
+++ b/apprise/plugins/prowl.py
@@ -250,6 +250,15 @@ class NotifyProwl(NotifyBase):
 
         return True
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (self.secure_protocol, self.apikey, self.providerkey)
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/pushbullet.py b/apprise/plugins/pushbullet.py
index 8e006db1..9f2226f3 100644
--- a/apprise/plugins/pushbullet.py
+++ b/apprise/plugins/pushbullet.py
@@ -386,6 +386,15 @@ class NotifyPushBullet(NotifyBase):
             if files:
                 files['file'][1].close()
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (self.secure_protocol, self.accesstoken)
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/pushdeer.py b/apprise/plugins/pushdeer.py
index fa888b15..226e3f21 100644
--- a/apprise/plugins/pushdeer.py
+++ b/apprise/plugins/pushdeer.py
@@ -180,6 +180,18 @@ class NotifyPushDeer(NotifyBase):
 
         return True
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (
+            self.secure_protocol if self.secure else self.protocol,
+            self.push_key, self.host, self.port,
+        )
+
     def url(self, privacy=False):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/pushed.py b/apprise/plugins/pushed.py
index 1ed83b9e..c0727ad1 100644
--- a/apprise/plugins/pushed.py
+++ b/apprise/plugins/pushed.py
@@ -303,6 +303,15 @@ class NotifyPushed(NotifyBase):
 
         return True
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (self.secure_protocol, self.app_key, self.app_secret)
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/pushjet.py b/apprise/plugins/pushjet.py
index f8dcfdf3..bc368b10 100644
--- a/apprise/plugins/pushjet.py
+++ b/apprise/plugins/pushjet.py
@@ -117,6 +117,18 @@ class NotifyPushjet(NotifyBase):
 
         return
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (
+            self.secure_protocol if self.secure else self.protocol,
+            self.user, self.password, self.host, self.port, self.secret_key,
+        )
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/pushme.py b/apprise/plugins/pushme.py
index abbed794..54d4032d 100644
--- a/apprise/plugins/pushme.py
+++ b/apprise/plugins/pushme.py
@@ -171,6 +171,15 @@ class NotifyPushMe(NotifyBase):
 
         return True
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (self.secure_protocol, self.token)
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/pushover.py b/apprise/plugins/pushover.py
index 954e7dd0..07881884 100644
--- a/apprise/plugins/pushover.py
+++ b/apprise/plugins/pushover.py
@@ -549,6 +549,15 @@ class NotifyPushover(NotifyBase):
 
         return True
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (self.secure_protocol, self.user_key, self.token)
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/pushsafer.py b/apprise/plugins/pushsafer.py
index 7bdca7a6..7d4052c0 100644
--- a/apprise/plugins/pushsafer.py
+++ b/apprise/plugins/pushsafer.py
@@ -756,6 +756,18 @@ class NotifyPushSafer(NotifyBase):
 
             return False, response
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (
+            self.secure_protocol if self.secure else self.protocol,
+            self.privatekey,
+        )
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/pushy.py b/apprise/plugins/pushy.py
index bb2a24ec..d0995df3 100644
--- a/apprise/plugins/pushy.py
+++ b/apprise/plugins/pushy.py
@@ -310,6 +310,15 @@ class NotifyPushy(NotifyBase):
 
         return not has_error
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (self.secure_protocol, self.apikey)
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/reddit.py b/apprise/plugins/reddit.py
index 3a60b5e0..1c261be9 100644
--- a/apprise/plugins/reddit.py
+++ b/apprise/plugins/reddit.py
@@ -324,6 +324,18 @@ class NotifyReddit(NotifyBase):
 
         return
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (
+            self.secure_protocol, self.client_id, self.client_secret,
+            self.user, self.password,
+        )
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/revolt.py b/apprise/plugins/revolt.py
index 1f518540..2edbed33 100644
--- a/apprise/plugins/revolt.py
+++ b/apprise/plugins/revolt.py
@@ -354,6 +354,15 @@ class NotifyRevolt(NotifyBase):
 
         return (True, content)
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (self.secure_protocol, self.bot_token)
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/rocketchat.py b/apprise/plugins/rocketchat.py
index 973651e3..7850c3d0 100644
--- a/apprise/plugins/rocketchat.py
+++ b/apprise/plugins/rocketchat.py
@@ -319,6 +319,23 @@ class NotifyRocketChat(NotifyBase):
 
         return
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (
+            self.secure_protocol if self.secure else self.protocol,
+            self.host,
+            self.port if self.port else (443 if self.secure else 80),
+            self.user,
+            self.password if self.mode in (
+                RocketChatAuthMode.BASIC, RocketChatAuthMode.TOKEN)
+            else self.webhook,
+        )
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/rsyslog.py b/apprise/plugins/rsyslog.py
index 9631c72f..195a2f3c 100644
--- a/apprise/plugins/rsyslog.py
+++ b/apprise/plugins/rsyslog.py
@@ -300,6 +300,19 @@ class NotifyRSyslog(NotifyBase):
 
         return True
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (
+            self.protocol, self.host,
+            self.port if self.port
+            else self.template_tokens['port']['default'],
+        )
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/ryver.py b/apprise/plugins/ryver.py
index 114dc6a0..c792c3d7 100644
--- a/apprise/plugins/ryver.py
+++ b/apprise/plugins/ryver.py
@@ -272,6 +272,15 @@ class NotifyRyver(NotifyBase):
 
         return True
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (self.secure_protocol, self.organization, self.token)
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/sendgrid.py b/apprise/plugins/sendgrid.py
index d50839f1..56a99a57 100644
--- a/apprise/plugins/sendgrid.py
+++ b/apprise/plugins/sendgrid.py
@@ -243,6 +243,15 @@ class NotifySendGrid(NotifyBase):
 
         return
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (self.secure_protocol, self.apikey, self.from_email)
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/serverchan.py b/apprise/plugins/serverchan.py
index 667d2e95..db157141 100644
--- a/apprise/plugins/serverchan.py
+++ b/apprise/plugins/serverchan.py
@@ -149,6 +149,15 @@ class NotifyServerChan(NotifyBase):
 
         return True
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (self.secure_protocol, self.token)
+
     def url(self, privacy=False):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/ses.py b/apprise/plugins/ses.py
index 5a2c047a..5fe4a369 100644
--- a/apprise/plugins/ses.py
+++ b/apprise/plugins/ses.py
@@ -770,6 +770,18 @@ class NotifySES(NotifyBase):
 
         return response
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (
+            self.secure_protocol, self.from_addr, self.aws_access_key_id,
+            self.aws_secret_access_key, self.aws_region_name,
+        )
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/sfr.py b/apprise/plugins/sfr.py
index c41e0752..27ea0fb0 100644
--- a/apprise/plugins/sfr.py
+++ b/apprise/plugins/sfr.py
@@ -356,6 +356,18 @@ class NotifySFR(NotifyBase):
 
         return not has_error
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (
+            self.secure_protocol if self.secure else self.protocol,
+            self.user, self.password, self.space_id,
+        )
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/signal_api.py b/apprise/plugins/signal_api.py
index 7e557133..5795e0cf 100644
--- a/apprise/plugins/signal_api.py
+++ b/apprise/plugins/signal_api.py
@@ -372,6 +372,18 @@ class NotifySignalAPI(NotifyBase):
 
         return not has_error
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (
+            self.secure_protocol if self.secure else self.protocol,
+            self.user, self.password, self.host, self.port, self.source,
+        )
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/simplepush.py b/apprise/plugins/simplepush.py
index 10b01b0f..023fcf9d 100644
--- a/apprise/plugins/simplepush.py
+++ b/apprise/plugins/simplepush.py
@@ -283,6 +283,15 @@ class NotifySimplePush(NotifyBase):
 
         return True
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (self.secure_protocol, self.user, self.password, self.apikey)
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/sinch.py b/apprise/plugins/sinch.py
index 06bd5b1e..51cdea32 100644
--- a/apprise/plugins/sinch.py
+++ b/apprise/plugins/sinch.py
@@ -381,6 +381,18 @@ class NotifySinch(NotifyBase):
 
         return not has_error
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (
+            self.secure_protocol if self.secure else self.protocol,
+            self.service_plan_id, self.api_token, self.source,
+        )
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/slack.py b/apprise/plugins/slack.py
index bf69d564..fb4a8b6e 100644
--- a/apprise/plugins/slack.py
+++ b/apprise/plugins/slack.py
@@ -326,6 +326,7 @@ class NotifySlack(NotifyBase):
         self.mode = SlackMode.BOT if access_token else SlackMode.WEBHOOK
 
         if self.mode is SlackMode.WEBHOOK:
+            self.access_token = None
             self.token_a = validate_regex(
                 token_a, *self.template_tokens['token_a']['regex'])
             if not self.token_a:
@@ -350,6 +351,9 @@ class NotifySlack(NotifyBase):
                 self.logger.warning(msg)
                 raise TypeError(msg)
         else:
+            self.token_a = None
+            self.token_b = None
+            self.token_c = None
             self.access_token = validate_regex(
                 access_token, *self.template_tokens['access_token']['regex'])
             if not self.access_token:
@@ -1018,6 +1022,18 @@ class NotifySlack(NotifyBase):
         # Return the response for processing
         return response
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (
+            self.secure_protocol, self.token_a, self.token_b, self.token_c,
+            self.access_token,
+        )
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
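
The Slack change pre-seeds whichever token set the active mode does not use, so the shared `url_identifier` tuple above can safely reference all four token attributes in either mode. Illustrative values only:

```python
# In webhook mode the bot access token is None; in bot mode the three
# webhook tokens are None.  Either way every attribute the identifier
# reads is defined, and the two modes still yield distinct identities:
webhook_id = ('slack', 'T1JJ3T3L2', 'A1BRTD4JD', 'TIiajkdnlazkcOXrIdevi7', None)
bot_id = ('slack', None, None, None, 'xoxb-1234-5678-abcdefghijkl')
assert webhook_id != bot_id
```
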
diff --git a/apprise/plugins/smseagle.py b/apprise/plugins/smseagle.py
index 8eddca58..81308345 100644
--- a/apprise/plugins/smseagle.py
+++ b/apprise/plugins/smseagle.py
@@ -564,6 +564,18 @@ class NotifySMSEagle(NotifyBase):
 
         return not has_error
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (
+            self.secure_protocol if self.secure else self.protocol,
+            self.token, self.host, self.port,
+        )
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/smsmanager.py b/apprise/plugins/smsmanager.py
index 1d352daf..a4552ad4 100644
--- a/apprise/plugins/smsmanager.py
+++ b/apprise/plugins/smsmanager.py
@@ -314,6 +314,15 @@ class NotifySMSManager(NotifyBase):
 
         return not has_error
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (self.secure_protocol[0], self.apikey)
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/smtp2go.py b/apprise/plugins/smtp2go.py
index 017da811..cb8c71ff 100644
--- a/apprise/plugins/smtp2go.py
+++ b/apprise/plugins/smtp2go.py
@@ -464,6 +464,15 @@ class NotifySMTP2Go(NotifyBase):
 
         return not has_error
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (self.secure_protocol, self.user, self.host, self.apikey)
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/sns.py b/apprise/plugins/sns.py
index cc6e8307..9deb9f7b 100644
--- a/apprise/plugins/sns.py
+++ b/apprise/plugins/sns.py
@@ -573,6 +573,18 @@ class NotifySNS(NotifyBase):
 
         return response
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (
+            self.secure_protocol, self.aws_access_key_id,
+            self.aws_secret_access_key, self.aws_region_name,
+        )
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/sparkpost.py b/apprise/plugins/sparkpost.py
index b873d6b0..b1fb7bca 100644
--- a/apprise/plugins/sparkpost.py
+++ b/apprise/plugins/sparkpost.py
@@ -670,6 +670,15 @@ class NotifySparkPost(NotifyBase):
 
         return not has_error
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (self.secure_protocol, self.user, self.apikey, self.host)
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/splunk.py b/apprise/plugins/splunk.py
index 3a4f1073..3c39e7a4 100644
--- a/apprise/plugins/splunk.py
+++ b/apprise/plugins/splunk.py
@@ -244,7 +244,7 @@ class NotifySplunk(NotifyBase):
             # Assign what was defined:
             self.entity_id = entity_id.strip(' \r\n\t\v/')
 
-        if isinstance(action, str) and action:
+        if action and isinstance(action, str):
             self.action = next(
                 (a for a in SPLUNK_ACTIONS if a.startswith(action)), None)
             if self.action not in SPLUNK_ACTIONS:
@@ -253,7 +253,7 @@ class NotifySplunk(NotifyBase):
                 self.logger.warning(msg)
                 raise TypeError(msg)
         else:
-            self.action = self.template_args['action']['default'] \
+            self.action = self.template_args['action']['default']
 
         # Store our mappings
         self.mapping = self.splunk_message_map.copy()
@@ -382,6 +382,18 @@ class NotifySplunk(NotifyBase):
 
         return True
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (
+            self.secure_protocol[0], self.routing_key, self.entity_id,
+            self.apikey,
+        )
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/streamlabs.py b/apprise/plugins/streamlabs.py
index d5edb645..ba48c170 100644
--- a/apprise/plugins/streamlabs.py
+++ b/apprise/plugins/streamlabs.py
@@ -376,6 +376,15 @@ class NotifyStreamlabs(NotifyBase):
 
         return True
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (self.secure_protocol, self.access_token)
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/synology.py b/apprise/plugins/synology.py
index ed85f80c..d29440b4 100644
--- a/apprise/plugins/synology.py
+++ b/apprise/plugins/synology.py
@@ -156,6 +156,19 @@ class NotifySynology(NotifyBase):
 
         return
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (
+            self.secure_protocol if self.secure else self.protocol,
+            self.user, self.password, self.host, self.port, self.token,
+            self.fullpath.rstrip('/'),
+        )
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/syslog.py b/apprise/plugins/syslog.py
index 935111ee..f0be5d4f 100644
--- a/apprise/plugins/syslog.py
+++ b/apprise/plugins/syslog.py
@@ -126,6 +126,10 @@ class NotifySyslog(NotifyBase):
     # A URL that takes you to the setup/help of the specific protocol
     setup_url = 'https://github.com/caronc/apprise/wiki/Notify_syslog'
 
+    # No URL Identifier will be defined for this service as there simply
+    # aren't enough details to uniquely identify one syslog:// from another.
+    url_identifier = False
+
     # Disable throttle rate for Syslog requests since they are normally
     # local anyway
     request_rate_per_sec = 0
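
Here `url_identifier = False` acts as a sentinel: a local syslog endpoint has no fields that distinguish one instance from another, so no identity (and no persistent-storage namespace) should be derived for it. A consumer honouring the sentinel might look like this sketch, reusing the hypothetical `to_url_id` helper from earlier:

```python
def lookup_key(plugin):
    # Assumption: the base class treats url_identifier == False as
    # "this plugin has no stable identity"; skip caching entirely.
    if plugin.url_identifier is False:
        return None
    return to_url_id(plugin.url_identifier)
```
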
diff --git a/apprise/plugins/techuluspush.py b/apprise/plugins/techuluspush.py
index 682bf088..62965014 100644
--- a/apprise/plugins/techuluspush.py
+++ b/apprise/plugins/techuluspush.py
@@ -184,6 +184,15 @@ class NotifyTechulusPush(NotifyBase):
 
         return True
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (self.secure_protocol, self.apikey)
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/telegram.py b/apprise/plugins/telegram.py
index e20655e3..fba4b660 100644
--- a/apprise/plugins/telegram.py
+++ b/apprise/plugins/telegram.py
@@ -63,6 +63,7 @@ from .base import NotifyBase
 from ..common import NotifyType
 from ..common import NotifyImageSize
 from ..common import NotifyFormat
+from ..common import PersistentStoreMode
 from ..utils import parse_bool
 from ..utils import parse_list
 from ..utils import validate_regex
@@ -164,6 +165,10 @@ class NotifyTelegram(NotifyBase):
     # Telegram is limited to sending a maximum of 100 requests per second.
     request_rate_per_sec = 0.001
 
+    # Our default is to not use persistent storage beyond an in-memory
+    # reference
+    storage_mode = PersistentStoreMode.AUTO
+
     # Define object templates
     templates = (
         '{schema}://{bot_token}',
@@ -715,6 +720,7 @@ class NotifyTelegram(NotifyBase):
                     self.logger.info(
                         'Detected Telegram user %s (userid=%d)' % (_user, _id))
                     # Return our detected userid
+                    self.store.set('bot_owner', _id)
                     return _id
 
         self.logger.warning(
@@ -729,7 +735,7 @@ class NotifyTelegram(NotifyBase):
         """
 
         if len(self.targets) == 0 and self.detect_owner:
-            _id = self.detect_bot_owner()
+            _id = self.store.get('bot_owner') or self.detect_bot_owner()
             if _id:
                 # Permanently store our id in our target list for next time
                 self.targets.append((str(_id), self.topic))
@@ -930,6 +936,15 @@ class NotifyTelegram(NotifyBase):
 
         return not has_error
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (self.secure_protocol, self.bot_token)
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
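
The Telegram change caches the detected bot owner in persistent storage, so the remote `getUpdates` round trip needed to resolve an implicit target happens at most once per bot token. A usage sketch with persistent storage enabled (the bot token below is a placeholder):

```python
import apprise

aobj = apprise.Apprise()
# No chat id supplied: the plugin must ask Telegram who owns the bot.
aobj.add('tgram://123456789:AAbbccddeeff/')  # placeholder bot token

# The first call resolves the owner remotely and stores it under the
# 'bot_owner' key; later calls read it back from the store instead.
aobj.notify('first notification resolves and caches the bot owner')
aobj.notify('subsequent notifications reuse the cached owner id')
```
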
diff --git a/apprise/plugins/threema.py b/apprise/plugins/threema.py
index 423c2312..55293a1c 100644
--- a/apprise/plugins/threema.py
+++ b/apprise/plugins/threema.py
@@ -302,6 +302,15 @@ class NotifyThreema(NotifyBase):
 
         return not has_error
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another simliar one. Targets or end points should never be identified
+        here.
+        """
+        return (self.secure_protocol, self.user, self.secret)
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/twilio.py b/apprise/plugins/twilio.py
index 82569a2d..d8666199 100644
--- a/apprise/plugins/twilio.py
+++ b/apprise/plugins/twilio.py
@@ -413,6 +413,18 @@ class NotifyTwilio(NotifyBase):
 
         return not has_error
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (
+            self.secure_protocol, self.account_sid, self.auth_token,
+            self.source,
+        )
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/twist.py b/apprise/plugins/twist.py
index 62d729f4..66f70f52 100644
--- a/apprise/plugins/twist.py
+++ b/apprise/plugins/twist.py
@@ -229,6 +229,18 @@ class NotifyTwist(NotifyBase):
                     self.default_notification_channel))
         return
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (
+            self.secure_protocol if self.secure else self.protocol,
+            self.user, self.password, self.host, self.port,
+        )
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/twitter.py b/apprise/plugins/twitter.py
index 8000a815..369aaac0 100644
--- a/apprise/plugins/twitter.py
+++ b/apprise/plugins/twitter.py
@@ -780,6 +780,18 @@ class NotifyTwitter(NotifyBase):
         """
         return 10000 if self.mode == TwitterMessageMode.DM else 280
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (
+            self.secure_protocol[0], self.ckey, self.csecret, self.akey,
+            self.asecret,
+        )
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/voipms.py b/apprise/plugins/voipms.py
index 3a4e6d25..6a5d4d5a 100644
--- a/apprise/plugins/voipms.py
+++ b/apprise/plugins/voipms.py
@@ -306,6 +306,17 @@ class NotifyVoipms(NotifyBase):
 
         return not has_error
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or endpoints should never be identified
+        here.
+        """
+        return (
+            self.secure_protocol, self.email, self.password, self.source,
+        )
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/vonage.py b/apprise/plugins/vonage.py
index 441a6ba6..3a9b2341 100644
--- a/apprise/plugins/vonage.py
+++ b/apprise/plugins/vonage.py
@@ -307,6 +307,15 @@ class NotifyVonage(NotifyBase):
 
         return not has_error
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or end points should never be identified
+        here.
+        """
+        return (self.secure_protocol[0], self.apikey, self.secret)
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/webexteams.py b/apprise/plugins/webexteams.py
index bd0bdb57..ccee386b 100644
--- a/apprise/plugins/webexteams.py
+++ b/apprise/plugins/webexteams.py
@@ -207,6 +207,15 @@ class NotifyWebexTeams(NotifyBase):
 
         return True
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or end points should never be identified
+        here.
+        """
+        return (self.secure_protocol[0], self.token)
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/wecombot.py b/apprise/plugins/wecombot.py
index ab634171..03282cd9 100644
--- a/apprise/plugins/wecombot.py
+++ b/apprise/plugins/wecombot.py
@@ -134,6 +134,15 @@ class NotifyWeComBot(NotifyBase):
         )
         return
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or end points should never be identified
+        here.
+        """
+        return (self.secure_protocol, self.key)
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/whatsapp.py b/apprise/plugins/whatsapp.py
index 7120d736..e0322b80 100644
--- a/apprise/plugins/whatsapp.py
+++ b/apprise/plugins/whatsapp.py
@@ -446,6 +446,15 @@ class NotifyWhatsApp(NotifyBase):
 
         return not has_error
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or end points should never be identified
+        here.
+        """
+        return (self.secure_protocol, self.from_phone_id, self.token)
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/windows.py b/apprise/plugins/windows.py
index 746fcd1d..fd9454b2 100644
--- a/apprise/plugins/windows.py
+++ b/apprise/plugins/windows.py
@@ -87,6 +87,10 @@ class NotifyWindows(NotifyBase):
     # The number of seconds to display the popup for
     default_popup_duration_sec = 12
 
+    # No URL Identifier will be defined for this service as there simply
+    # aren't enough details to uniquely identify one windows:// from another.
+    url_identifier = False
+
     # Define object templates
     templates = (
         '{schema}://',
diff --git a/apprise/plugins/workflows.py b/apprise/plugins/workflows.py
index 4e9d80e0..e047a9f5 100644
--- a/apprise/plugins/workflows.py
+++ b/apprise/plugins/workflows.py
@@ -432,6 +432,18 @@ class NotifyWorkflows(NotifyBase):
 
         return True
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or end points should never be identified
+        here.
+        """
+        return (
+            self.secure_protocol[0], self.host, self.port, self.workflow,
+            self.signature,
+        )
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/xbmc.py b/apprise/plugins/xbmc.py
index 8006e100..61e7b1e2 100644
--- a/apprise/plugins/xbmc.py
+++ b/apprise/plugins/xbmc.py
@@ -299,6 +299,25 @@ class NotifyXBMC(NotifyBase):
 
         return True
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or end points should never be identified
+        here.
+        """
+        default_schema = self.xbmc_protocol if (
+            self.protocol <= self.xbmc_remote_protocol) else self.kodi_protocol
+        if self.secure:
+            # Append 's' to schema
+            default_schema += 's'
+
+        port = self.port if self.port else (
+            443 if self.secure else self.xbmc_default_port)
+        return (
+            default_schema, self.user, self.password, self.host, port,
+        )
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/plugins/zulip.py b/apprise/plugins/zulip.py
index e829e6f6..34c6d813 100644
--- a/apprise/plugins/zulip.py
+++ b/apprise/plugins/zulip.py
@@ -334,6 +334,18 @@ class NotifyZulip(NotifyBase):
 
         return not has_error
 
+    @property
+    def url_identifier(self):
+        """
+        Returns all of the identifiers that make this URL unique from
+        another similar one. Targets or end points should never be identified
+        here.
+        """
+        return (
+            self.secure_protocol, self.organization, self.hostname,
+            self.token,
+        )
+
     def url(self, privacy=False, *args, **kwargs):
         """
         Returns the URL built dynamically based on specified arguments.
diff --git a/apprise/url.py b/apprise/url.py
index 76d623be..e3c9c17d 100644
--- a/apprise/url.py
+++ b/apprise/url.py
@@ -26,9 +26,11 @@
 # ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
 # POSSIBILITY OF SUCH DAMAGE.
 
+import sys
 import re
 from .logger import logger
 import time
+import hashlib
 from datetime import datetime
 from xml.sax.saxutils import escape as sax_escape
 
@@ -103,6 +105,16 @@ class URLBase:
     # server to send a response.
     socket_read_timeout = 4.0
 
+    # Provides the information required to generate a unique id when calling
+    # url_id().  Override this in subclasses; subclasses should set this to
+    # False if no url_id can be generated.
+    url_identifier = None
+
+    # Tracks the last generated url_id() to prevent regeneration; this
+    # initializes to False and is set thereafter.  It is an internal value
+    # for this class only and should never be set to anything other than
+    # False below...
+    __cached_url_identifier = False
+
     # Handle
     # Maintain a set of tags to associate with this specific notification
     tags = set()
@@ -185,6 +197,8 @@ class URLBase:
 
     template_kwargs = {}
 
+    # Internal Values
+
     def __init__(self, asset=None, **kwargs):
         """
         Initialize some general logging and common server arguments that will
@@ -197,17 +211,17 @@ class URLBase:
             asset if isinstance(asset, AppriseAsset) else AppriseAsset()
 
         # Certificate Verification (for SSL calls); default to being enabled
-        self.verify_certificate = parse_bool(kwargs.get('verify', True))
+        self.verify_certificate = parse_bool(
+            kwargs.get('verify', URLBase.verify_certificate))
+
+        # Schema
+        self.schema = kwargs.get('schema', 'unknown').lower()
 
         # Secure Mode
         self.secure = kwargs.get('secure', None)
-        try:
-            if not isinstance(self.secure, bool):
-                # Attempt to detect
-                self.secure = kwargs.get('schema', '')[-1].lower() == 's'
-
-        except (TypeError, IndexError):
-            self.secure = False
+        if not isinstance(self.secure, bool):
+            # Attempt to detect
+            self.secure = self.schema[-1:] == 's'
 
         self.host = URLBase.unquote(kwargs.get('host'))
         self.port = kwargs.get('port')
@@ -334,7 +348,7 @@ class URLBase:
 
         default_port = 443 if self.secure else 80
 
-        return '{schema}://{auth}{hostname}{port}{fullpath}?{params}'.format(
+        return '{schema}://{auth}{hostname}{port}{fullpath}{params}'.format(
             schema='https' if self.secure else 'http',
             auth=auth,
             # never encode hostname since we're expecting it to be a valid one
@@ -343,9 +357,117 @@ class URLBase:
                  else ':{}'.format(self.port),
             fullpath=URLBase.quote(self.fullpath, safe='/')
             if self.fullpath else '/',
-            params=URLBase.urlencode(params),
+            params=('?' + URLBase.urlencode(params) if params else ''),
         )
 
+    def url_id(self, lazy=True, hash_engine=hashlib.sha256):
+        """
+        Returns a unique URL identifier representing the Apprise URL itself.
+        The url_id is always a hash string, or None if it can't be generated.
+
+        The idea is to build the ID based only on the credentials or specific
+        elements relative to the URL itself. The URL ID should never factor in
+        (or else it's a bug) the following:
+          - any targets defined
+          - any GET parameter options, unless they explicitly change the
+            complete function of the code.
+
+             For example: GET parameters like ?image=false&avatar=no should
+             have no bearing on the uniqueness of the Apprise URL Identifier.
+
+             Consider plugins where some GET parameters completely change
+             how the entire upstream communication works, such as slack:// and
+             matrix:// which have a mode. In these circumstances, they should
+             be considered in the unique generation.
+
+        The intention of this function is to help align Apprise URLs that are
+        common with one another and can therefore share the same persistent
+        storage, even when subtle changes are made to them.
+
+        Hence the following would all return the same URL Identifier:
+             json://abc/def/ghi?image=no
+             json://abc/def/ghi/?test=yes&image=yes
+
+        """
+
+        if lazy and self.__cached_url_identifier is not False:
+            return self.__cached_url_identifier \
+                if not (self.__cached_url_identifier
+                        and self.asset.storage_idlen) \
+                else self.__cached_url_identifier[:self.asset.storage_idlen]
+
+        # Python v3.9 introduces usedforsecurity argument
+        kwargs = {'usedforsecurity': False} \
+            if sys.version_info >= (3, 9) else {}
+
+        if self.url_identifier is False:
+            # Disabled
+            self.__cached_url_identifier = None
+
+        elif self.url_identifier in (None, True):
+
+            # Prepare our object
+            engine = hash_engine(
+                self.asset.storage_salt + self.schema.encode(
+                    self.asset.encoding), **kwargs)
+
+            # We want to treat `None` differently than a blank entry
+            engine.update(
+                b'\0' if self.password is None
+                else self.password.encode(self.asset.encoding))
+            engine.update(
+                b'\0' if self.user is None
+                else self.user.encode(self.asset.encoding))
+            engine.update(
+                b'\0' if not self.host
+                else self.host.encode(self.asset.encoding))
+            engine.update(
+                b'\0' if self.port is None
+                else f'{self.port}'.encode(self.asset.encoding))
+            engine.update(
+                self.fullpath.rstrip('/').encode(self.asset.encoding))
+            engine.update(b's' if self.secure else b'i')
+
+            # Save our generated content
+            self.__cached_url_identifier = engine.hexdigest()
+
+        elif isinstance(self.url_identifier, str):
+            self.__cached_url_identifier = hash_engine(
+                self.asset.storage_salt + self.url_identifier.encode(
+                    self.asset.encoding), **kwargs).hexdigest()
+
+        elif isinstance(self.url_identifier, bytes):
+            self.__cached_url_identifier = hash_engine(
+                self.asset.storage_salt + self.url_identifier,
+                **kwargs).hexdigest()
+
+        elif isinstance(self.url_identifier, (list, tuple, set)):
+            self.__cached_url_identifier = hash_engine(
+                self.asset.storage_salt + b''.join([
+                    (x if isinstance(x, bytes)
+                     else str(x).encode(self.asset.encoding))
+                    for x in self.url_identifier]), **kwargs).hexdigest()
+
+        elif isinstance(self.url_identifier, dict):
+            self.__cached_url_identifier = hash_engine(
+                self.asset.storage_salt + b''.join([
+                    (x if isinstance(x, bytes)
+                     else str(x).encode(self.asset.encoding))
+                    for x in self.url_identifier.values()]),
+                **kwargs).hexdigest()
+
+        else:
+            self.__cached_url_identifier = hash_engine(
+                self.asset.storage_salt + str(
+                    self.url_identifier).encode(self.asset.encoding),
+                **kwargs).hexdigest()
+
+        return self.__cached_url_identifier \
+            if not (self.__cached_url_identifier
+                    and self.asset.storage_idlen) \
+            else self.__cached_url_identifier[:self.asset.storage_idlen]
+
     def __contains__(self, tags):
         """
         Returns true if the tag specified is associated with this notification.
@@ -660,14 +782,22 @@ class URLBase:
         this class.
         """
 
-        return {
-            # The socket read timeout
-            'rto': str(self.socket_read_timeout),
-            # The request/socket connect timeout
-            'cto': str(self.socket_connect_timeout),
-            # Certificate verification
-            'verify': 'yes' if self.verify_certificate else 'no',
-        }
+        # parameters are only provided on demand to keep the URL short
+        params = {}
+
+        # The socket read timeout
+        if self.socket_read_timeout != URLBase.socket_read_timeout:
+            params['rto'] = str(self.socket_read_timeout)
+
+        # The request/socket connect timeout
+        if self.socket_connect_timeout != URLBase.socket_connect_timeout:
+            params['cto'] = str(self.socket_connect_timeout)
+
+        # Certificate verification
+        if self.verify_certificate != URLBase.verify_certificate:
+            params['verify'] = 'yes' if self.verify_certificate else 'no'
+
+        return params
 
     @staticmethod
     def post_process_parse_url_results(results):
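To make the behavior described in the `url_id()` docstring concrete, here is an illustrative snippet mirroring the new tests later in this diff; the actual hash values depend on the configured `storage_salt` and `storage_idlen`:

```python
from apprise import Apprise

# Cosmetic differences (a trailing slash, no-op GET parameters) collapse
# to the same identifier...
a = Apprise.instantiate('json://user@localhost/path?image=no')
b = Apprise.instantiate('json://user@localhost/path/?test=yes')
assert a.url_id() == b.url_id()

# ...whereas credential or transport changes do not
c = Apprise.instantiate('jsons://user@localhost/path')      # secure flag
d = Apprise.instantiate('json://user:pass@localhost/path')  # password
assert a.url_id() != c.url_id()
assert a.url_id() != d.url_id()
```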
diff --git a/apprise/utils.py b/apprise/utils.py
index 2aff569e..b33ad63a 100644
--- a/apprise/utils.py
+++ b/apprise/utils.py
@@ -31,7 +31,9 @@ import sys
 import json
 import contextlib
 import os
+import binascii
 import locale
+import platform
 import typing
 import base64
 from itertools import chain
@@ -47,6 +49,20 @@ from urllib.parse import urlencode as _urlencode
 import importlib.util
 
 
+# A simple, re-usable path decoder that ensures a provided path is
+# expanded correctly for the running platform.
+__PATH_DECODER = os.path.expandvars if \
+    platform.system() == 'Windows' else os.path.expanduser
+
+
+def path_decode(path):
+    """
+    Returns the fully decoded path based on the operating system
+    """
+    return os.path.abspath(__PATH_DECODER(path))
+
+
 def import_module(path, name):
     """
     Load our module based on path
@@ -1604,33 +1620,144 @@ def dict_full_update(dict1, dict2):
     return
 
 
+def dir_size(path, max_depth=3, missing_okay=True, _depth=0, _errors=None):
+    """
+    Scans the provided path and returns its size (in bytes)
+    """
+
+    if _errors is None:
+        _errors = set()
+
+    if _depth > max_depth:
+        _errors.add(path)
+        return (0, _errors)
+
+    total = 0
+    try:
+        with os.scandir(path) as it:
+            for entry in it:
+                try:
+                    if entry.is_file(follow_symlinks=False):
+                        total += entry.stat(follow_symlinks=False).st_size
+
+                    elif entry.is_dir(follow_symlinks=False):
+                        (totals, _) = dir_size(
+                            entry.path,
+                            max_depth=max_depth,
+                            _depth=_depth + 1,
+                            _errors=_errors)
+                        total += totals
+
+                except FileNotFoundError:
+                    # no worries; Nothing to do
+                    continue
+
+                except (OSError, IOError) as e:
+                    # Permission error of some kind or disk problem...
+                    # There is nothing we can do at this point
+                    _errors.add(entry.path)
+                    logger.warning(
+                        'dir_size detected inaccessible path: %s',
+                        os.fsdecode(entry.path))
+                    logger.debug('dir_size Exception: %s' % str(e))
+                    continue
+
+    except FileNotFoundError:
+        if not missing_okay:
+            # Conditional error situation
+            _errors.add(path)
+
+    except (OSError, IOError) as e:
+        # Permission error of some kind or disk problem...
+        # There is nothing we can do at this point
+        _errors.add(path)
+        logger.warning(
+            'dir_size detected inaccessible path: %s',
+            os.fsdecode(path))
+        logger.debug('dir_size Exception: %s' % str(e))
+
+    return (total, _errors)
+
+
+def bytes_to_str(value):
+    """
+    Convert an integer (in bytes) into its string representation with an
+    accompanying unit value (such as B, KB, MB, GB, or TB)
+    """
+    unit = 'B'
+    try:
+        value = float(value)
+
+    except (ValueError, TypeError):
+        return None
+
+    if value >= 1024.0:
+        value = value / 1024.0
+        unit = 'KB'
+        if value >= 1024.0:
+            value = value / 1024.0
+            unit = 'MB'
+            if value >= 1024.0:
+                value = value / 1024.0
+                unit = 'GB'
+                if value >= 1024.0:
+                    value = value / 1024.0
+                    unit = 'TB'
+
+    return '%.2f%s' % (round(value, 2), unit)
+
+
 def decode_b64_dict(di: dict) -> dict:
+    """
+    Decodes a dictionary previously encoded with encode_b64_dict()
+
+    Only string entries prefixed with `b64:` are targeted
+    """
     di = copy.deepcopy(di)
     for k, v in di.items():
         if not isinstance(v, str) or not v.startswith("b64:"):
             continue
+
         try:
             parsed_v = base64.b64decode(v[4:])
             parsed_v = json.loads(parsed_v)
-        except Exception:
+
+        except (ValueError, TypeError, binascii.Error,
+                json.decoder.JSONDecodeError):
+            # ValueError: the length of altchars is not 2.
+            # TypeError: invalid input
+            # binascii.Error: not base64 (bad padding)
+            # json.decoder.JSONDecodeError: Bad JSON object
+
             parsed_v = v
         di[k] = parsed_v
     return di
 
 
-def encode_b64_dict(
-        di: dict
-) -> typing.Tuple[dict, bool]:
+def encode_b64_dict(di: dict, encoding='utf-8') -> typing.Tuple[dict, bool]:
+    """
+    Encodes dictionary entries containing non-string types (such as int or
+    float) into base64
+
+    The final product always contains string-based values
+    """
     di = copy.deepcopy(di)
     needs_decoding = False
     for k, v in di.items():
         if isinstance(v, str):
             continue
+
         try:
-            encoded = base64.urlsafe_b64encode(json.dumps(v).encode())
-            encoded = "b64:{}".format(encoded.decode())
+            encoded = base64.urlsafe_b64encode(json.dumps(v).encode(encoding))
+            encoded = "b64:{}".format(encoded.decode(encoding))
             needs_decoding = True
-        except Exception:
+
+        except (ValueError, TypeError):
+            # ValueError:
+            #  - the length of altchars is not 2.
+            # TypeError:
+            #  - json not serializable or
+            #  - bytes object not passed into urlsafe_b64encode()
             encoded = str(v)
+
         di[k] = encoded
     return di, needs_decoding
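A short usage sketch of the helpers added above (the storage path here is a placeholder):

```python
from apprise.utils import (
    bytes_to_str, decode_b64_dict, dir_size, encode_b64_dict)

# Measure how much disk space a (hypothetical) storage path consumes
total, errors = dir_size('/tmp/apprise-storage')
print(bytes_to_str(total))          # e.g. '81.00B' or '1.25KB'
if errors:
    print('Inaccessible paths:', errors)

# Round-trip non-string values through their base64 representation
encoded, needs_decoding = encode_b64_dict({'retries': 3, 'name': 'abc'})
assert needs_decoding is True       # 'retries' was not a string
assert decode_b64_dict(encoded) == {'retries': 3, 'name': 'abc'}
```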
diff --git a/bin/test.sh b/bin/test.sh
index 15fc5473..64d3c245 100755
--- a/bin/test.sh
+++ b/bin/test.sh
@@ -27,8 +27,9 @@
 # ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
 # POSSIBILITY OF SUCH DAMAGE.
 #
-PYTEST=$(which py.test)
-
+PYTEST=$(which py.test 2>/dev/null)
+# Support different distributions
+[ -z "$PYTEST" ] && PYTEST=$(which py.test-3 2>/dev/null)
 # This script can basically be used to test individual tests that have
 # been created. Just run the to run all tests:
 #    ./devel/test.sh
diff --git a/docker-compose.yml b/docker-compose.yml
index a23865e2..bc48e333 100644
--- a/docker-compose.yml
+++ b/docker-compose.yml
@@ -1,58 +1,58 @@
 version: "3.3"
 services:
+  test.py39:
+    build:
+      context: .
+      dockerfile: test/docker/Dockerfile.py39
+    volumes:
+      - ./:/apprise
+
   test.py310:
     build:
       context: .
-      dockerfile: Dockerfile.py310
+      dockerfile: test/docker/Dockerfile.py310
     volumes:
       - ./:/apprise
 
   test.py311:
     build:
       context: .
-      dockerfile: Dockerfile.py311
+      dockerfile: test/docker/Dockerfile.py311
     volumes:
       - ./:/apprise
 
   test.py312:
     build:
       context: .
-      dockerfile: Dockerfile.py312
-    volumes:
-      - ./:/apprise
-
-  rpmbuild.el8:
-    build:
-      context: .
-      dockerfile: Dockerfile.el8
+      dockerfile: test/docker/Dockerfile.py312
     volumes:
       - ./:/apprise
 
   rpmbuild.el9:
     build:
       context: .
-      dockerfile: Dockerfile.el9
+      dockerfile: test/docker/Dockerfile.el9
     volumes:
       - ./:/apprise
 
   rpmbuild.f37:
     build:
       context: .
-      dockerfile: Dockerfile.f37
+      dockerfile: test/docker/Dockerfile.f37
     volumes:
       - ./:/apprise
 
   rpmbuild.f39:
     build:
       context: .
-      dockerfile: Dockerfile.f39
+      dockerfile: test/docker/Dockerfile.f39
     volumes:
       - ./:/apprise
 
   rpmbuild.rawhide:
     build:
       context: .
-      dockerfile: Dockerfile.rawhide
+      dockerfile: test/docker/Dockerfile.rawhide
     volumes:
       - ./:/apprise
 
diff --git a/setup.py b/setup.py
index 960b5fb9..0e1095b4 100755
--- a/setup.py
+++ b/setup.py
@@ -94,6 +94,7 @@ setup(
         'Natural Language :: English',
         'Programming Language :: Python',
         'Programming Language :: Python :: 3',
+        'Programming Language :: Python :: 3.6',
         'Programming Language :: Python :: 3.7',
         'Programming Language :: Python :: 3.8',
         'Programming Language :: Python :: 3.9',
diff --git a/test/conftest.py b/test/conftest.py
index 34a2d9da..002e2424 100644
--- a/test/conftest.py
+++ b/test/conftest.py
@@ -28,6 +28,7 @@
 
 import sys
 import os
+import gc
 
 import pytest
 import mimetypes
@@ -69,3 +70,14 @@ def no_throttling_everywhere(session_mocker):
 
     for plugin in N_MGR.plugins():
         session_mocker.patch.object(plugin, "request_rate_per_sec", 0)
+
+
+@pytest.fixture(scope="function", autouse=True)
+def collect_all_garbage(session_mocker):
+    """
+    A pytest fixture that forces garbage collection after every test to
+    ensure no __del__ cleanup call from one plugin causes testing issues
+    with another
+    """
+    # Force garbage collection
+    gc.collect()
diff --git a/Dockerfile.el9 b/test/docker/Dockerfile.el9
similarity index 98%
rename from Dockerfile.el9
rename to test/docker/Dockerfile.el9
index 08700ba4..9699b017 100644
--- a/Dockerfile.el9
+++ b/test/docker/Dockerfile.el9
@@ -51,7 +51,7 @@ RUN rpmspec -q --buildrequires /python-apprise.spec | cut -f1 -d' ' | \
     xargs dnf install -y
 
 # RPM Build Structure Setup
-ENV FLAVOR=rpmbuild OS=centos DIST=el8
+ENV FLAVOR=rpmbuild OS=centos DIST=el9
 RUN useradd builder -u 1000 -m -G users,wheel &>/dev/null && \
     echo "builder ALL=(ALL:ALL) NOPASSWD:ALL" >> /etc/sudoers
 
diff --git a/Dockerfile.f37 b/test/docker/Dockerfile.f37
similarity index 100%
rename from Dockerfile.f37
rename to test/docker/Dockerfile.f37
diff --git a/Dockerfile.f39 b/test/docker/Dockerfile.f39
similarity index 100%
rename from Dockerfile.f39
rename to test/docker/Dockerfile.f39
diff --git a/Dockerfile.py310 b/test/docker/Dockerfile.py310
similarity index 100%
rename from Dockerfile.py310
rename to test/docker/Dockerfile.py310
diff --git a/Dockerfile.py311 b/test/docker/Dockerfile.py311
similarity index 100%
rename from Dockerfile.py311
rename to test/docker/Dockerfile.py311
diff --git a/Dockerfile.py312 b/test/docker/Dockerfile.py312
similarity index 100%
rename from Dockerfile.py312
rename to test/docker/Dockerfile.py312
diff --git a/test/docker/Dockerfile.py39 b/test/docker/Dockerfile.py39
new file mode 100644
index 00000000..6c79e411
--- /dev/null
+++ b/test/docker/Dockerfile.py39
@@ -0,0 +1,49 @@
+# -*- coding: utf-8 -*-
+# BSD 2-Clause License
+#
+# Apprise - Push Notification Library.
+# Copyright (c) 2024, Chris Caron <lead2gold@gmail.com>
+#
+# Redistribution and use in source and binary forms, with or without
+# modification, are permitted provided that the following conditions are met:
+#
+# 1. Redistributions of source code must retain the above copyright notice,
+#    this list of conditions and the following disclaimer.
+#
+# 2. Redistributions in binary form must reproduce the above copyright notice,
+#    this list of conditions and the following disclaimer in the documentation
+#    and/or other materials provided with the distribution.
+#
+# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
+# AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
+# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
+# ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE
+# LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
+# CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
+# SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
+# INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
+# CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
+# ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
+# POSSIBILITY OF SUCH DAMAGE.
+
+# Base
+FROM python:3.9-buster
+RUN apt-get update && \
+    apt-get install -y --no-install-recommends libdbus-1-dev libgirepository1.0-dev build-essential musl-dev bash dbus && \
+    rm -rf /var/lib/apt/lists/*
+RUN pip install --no-cache-dir dbus-python "PyGObject==3.44.2"
+
+# Apprise Setup
+VOLUME ["/apprise"]
+WORKDIR /apprise
+COPY requirements.txt /
+COPY dev-requirements.txt /
+ENV PYTHONPATH /apprise
+ENV PYTHONPYCACHEPREFIX /apprise/__pycache__/py39
+
+RUN pip install --no-cache-dir -r /requirements.txt -r /dev-requirements.txt
+
+RUN addgroup --gid ${USER_GID:-1000} apprise
+RUN adduser --system --uid ${USER_UID:-1000} --ingroup apprise --home /apprise --no-create-home --disabled-password apprise
+
+USER apprise
diff --git a/Dockerfile.rawhide b/test/docker/Dockerfile.rawhide
similarity index 100%
rename from Dockerfile.rawhide
rename to test/docker/Dockerfile.rawhide
diff --git a/test/helpers/rest.py b/test/helpers/rest.py
index c3ab5bcf..772618d1 100644
--- a/test/helpers/rest.py
+++ b/test/helpers/rest.py
@@ -37,6 +37,7 @@ from string import ascii_uppercase as str_alpha
 from string import digits as str_num
 
 from apprise import NotifyBase
+from apprise import PersistentStoreMode
 from apprise import NotifyType
 from apprise import Apprise
 from apprise import AppriseAsset
@@ -105,18 +106,18 @@ class AppriseURLTester:
             'meta': meta,
         })
 
-    def run_all(self):
+    def run_all(self, tmpdir=None):
         """
         Run all of our tests
         """
         # iterate over our dictionary and test it out
         for (url, meta) in self.__tests:
-            self.run(url, meta)
+            self.run(url, meta, tmpdir)
 
     @mock.patch('requests.get')
     @mock.patch('requests.post')
     @mock.patch('requests.request')
-    def run(self, url, meta, mock_request, mock_post, mock_get):
+    def run(self, url, meta, tmpdir, mock_request, mock_post, mock_get):
         """
         Run a specific test
         """
@@ -134,16 +135,38 @@ class AppriseURLTester:
         # Our regular expression
         url_matches = meta.get('url_matches')
 
+        # Detect our storage path (used to set the persistent storage
+        # mode)
+        storage_path = \
+            tmpdir if tmpdir and isinstance(tmpdir, str) and \
+            os.path.isdir(tmpdir) else None
+
+        # Our storage mode to set
+        storage_mode = meta.get(
+            'storage_mode',
+            PersistentStoreMode.MEMORY
+            if not storage_path else PersistentStoreMode.AUTO)
+
+        # Debug Mode
+        pdb = meta.get('pdb', False)
+
         # Whether or not we should include an image with our request; unless
         # otherwise specified, we assume that images are to be included
         include_image = meta.get('include_image', True)
         if include_image:
             # a default asset
-            asset = AppriseAsset()
+            asset = AppriseAsset(
+                storage_mode=storage_mode,
+                storage_path=storage_path,
+            )
 
         else:
             # Disable images
-            asset = AppriseAsset(image_path_mask=False, image_url_mask=False)
+            asset = AppriseAsset(
+                image_path_mask=False, image_url_mask=False,
+                storage_mode=storage_mode,
+                storage_path=storage_path,
+            )
             asset.image_url_logo = None
 
         # Mock our request object
@@ -153,6 +176,12 @@ class AppriseURLTester:
         mock_post.return_value = robj
         mock_request.return_value = robj
 
+        if pdb:
+            # Makes it easier to debug with this piece of code;
+            # just add 'pdb': True to the meta of the call that is failing
+            import pdb
+            pdb.set_trace()
+
         try:
             # We can now instantiate our object:
             obj = Apprise.instantiate(
@@ -201,6 +230,13 @@ class AppriseURLTester:
             # this url
             assert isinstance(obj.url(), str) is True
 
+            # Test that we support a url identifier
+            url_id = obj.url_id()
+
+            # It can be either disabled or a string; nothing else
+            assert isinstance(url_id, str) or \
+                (url_id is None and obj.url_identifier is False)
+
             # Verify we can acquire a target count as an integer
             assert isinstance(len(obj), int)
 
@@ -230,6 +266,20 @@ class AppriseURLTester:
             # from the one that was already created properly
             obj_cmp = Apprise.instantiate(obj.url())
 
+            # Our new object should produce the same url identifier
+            if obj.url_identifier != obj_cmp.url_identifier:
+                print('Provided %s' % url)
+                raise AssertionError(
+                    "URL Identifier: '{}' != expected '{}'".format(
+                        obj_cmp.url_identifier, obj.url_identifier))
+
+            # Back our check up
+            if obj.url_id() != obj_cmp.url_id():
+                print('Provided %s' % url)
+                raise AssertionError(
+                    "URL ID(): '{}' != expected '{}'".format(
+                        obj_cmp.url_id(), obj.url_id()))
+
             # Our object should be the same instance as what we had
             # originally expected above.
             if not isinstance(obj_cmp, NotifyBase):
@@ -251,6 +301,7 @@ class AppriseURLTester:
 
             # Tidy our object
             del obj_cmp
+            del instance
 
         if _self:
             # Iterate over our expected entries inside of our
@@ -557,7 +608,7 @@ class AppriseURLTester:
                     try:
                         assert obj.notify(
                             body=self.body, title=self.title,
-                            notify_type=NotifyType.INFO) is False
+                            notify_type=notify_type) is False
 
                     except AssertionError:
                         # Don't mess with these entries
@@ -603,7 +654,7 @@ class AppriseURLTester:
                     try:
                         assert obj.notify(
                             body=self.body,
-                            notify_type=NotifyType.INFO) is False
+                            notify_type=notify_type) is False
 
                     except AssertionError:
                         # Don't mess with these entries
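The tester above selects between memory-only and on-disk persistence based on whether a usable tmpdir was provided. The equivalent direct construction (with a placeholder path) looks roughly like this:

```python
from apprise import AppriseAsset, PersistentStoreMode

# With a real directory available, gathered data may be written to disk
asset = AppriseAsset(
    storage_mode=PersistentStoreMode.AUTO,
    storage_path='/tmp/apprise-tests',  # placeholder path
)

# Without one, persistence is disabled entirely
memory_only = AppriseAsset(storage_mode=PersistentStoreMode.MEMORY)
```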
diff --git a/test/test_api.py b/test/test_api.py
index a7227244..2c7fa4b3 100644
--- a/test/test_api.py
+++ b/test/test_api.py
@@ -234,10 +234,6 @@ def apprise_test(do_notify):
             # We fail whenever we're initialized
             raise TypeError()
 
-        def url(self, **kwargs):
-            # Support URL
-            return ''
-
         @staticmethod
         def parse_url(url, *args, **kwargs):
             # always parseable
@@ -248,10 +244,6 @@ def apprise_test(do_notify):
             super().__init__(
                 notify_format=NotifyFormat.HTML, **kwargs)
 
-        def url(self, **kwargs):
-            # Support URL
-            return ''
-
         def send(self, **kwargs):
             # Pretend everything is okay
             return True
@@ -347,10 +339,6 @@ def apprise_test(do_notify):
             # Pretend everything is okay (async)
             raise TypeError()
 
-        def url(self, **kwargs):
-            # Support URL
-            return ''
-
     class RuntimeNotification(NotifyBase):
         def notify(self, **kwargs):
             # Pretend everything is okay
@@ -360,10 +348,6 @@ def apprise_test(do_notify):
             # Pretend everything is okay (async)
             raise TypeError()
 
-        def url(self, **kwargs):
-            # Support URL
-            return ''
-
     class FailNotification(NotifyBase):
 
         def notify(self, **kwargs):
@@ -374,10 +358,6 @@ def apprise_test(do_notify):
             # Pretend everything is okay (async)
             raise TypeError()
 
-        def url(self, **kwargs):
-            # Support URL
-            return ''
-
     # Store our bad notification in our schema map
     N_MGR['throw'] = ThrowNotification
 
@@ -409,10 +389,6 @@ def apprise_test(do_notify):
             # Pretend everything is okay
             raise TypeError()
 
-        def url(self, **kwargs):
-            # Support URL
-            return ''
-
     N_MGR.unload_modules()
     N_MGR['throw'] = ThrowInstantiateNotification
 
@@ -440,6 +416,30 @@ def apprise_test(do_notify):
     a.clear()
     assert len(a) == 0
 
+    with pytest.raises(ValueError):
+        # Encoding error
+        AppriseAsset(encoding='ascii', storage_salt="γƒœγƒΌγƒ«γƒˆ")
+
+    with pytest.raises(ValueError):
+        # Not a valid storage salt (must be str or bytes)
+        AppriseAsset(storage_salt=42)
+
+    # Set our cache to be off
+    plugin = a.instantiate('good://localhost?store=no', asset=asset)
+    assert isinstance(plugin, NotifyBase)
+    assert plugin.url_id(lazy=False) is None
+    # Verify our cache is disabled
+    assert 'store=no' in plugin.url()
+
+    with pytest.raises(ValueError):
+        # idlen must be greater than 0
+        AppriseAsset(storage_idlen=-1)
+
+    # Create a larger idlen
+    asset = AppriseAsset(storage_idlen=32)
+    plugin = a.instantiate('good://localhost', asset=asset)
+    assert len(plugin.url_id()) == 32
+
     # Instantiate a bad object
     plugin = a.instantiate(object, tag="bad_object")
     assert plugin is None
@@ -527,7 +527,7 @@ def apprise_test(do_notify):
     assert len(a) == 0
 
 
-def test_apprise_pretty_print(tmpdir):
+def test_apprise_pretty_print():
     """
     API: Apprise() Pretty Print tests
 
@@ -724,7 +724,7 @@ def apprise_tagging_test(mock_post, mock_get, do_notify):
         tag=[(object, ), ]) is None
 
 
-def test_apprise_schemas(tmpdir):
+def test_apprise_schemas():
     """
     API: Apprise().schema() tests
 
@@ -830,8 +830,143 @@ def test_apprise_urlbase_object():
     assert base.request_url == 'http://127.0.0.1/path/'
     assert base.url().startswith('http://user@127.0.0.1/path/')
 
+    # Generic initialization
+    base = URLBase(**{'schema': ''})
+    assert base.request_timeout == (4.0, 4.0)
+    assert base.request_auth is None
+    assert base.request_url == 'http:///'
+    assert base.url().startswith('http:///')
 
-def test_apprise_notify_formats(tmpdir):
+    base = URLBase()
+    assert base.request_timeout == (4.0, 4.0)
+    assert base.request_auth is None
+    assert base.request_url == 'http:///'
+    assert base.url().startswith('http:///')
+
+
+def test_apprise_unique_id():
+    """
+    API: Apprise() URL Identifier tests
+
+    """
+
+    # Default testing
+    obj1 = Apprise.instantiate('json://user@127.0.0.1/path')
+    obj2 = Apprise.instantiate('json://user@127.0.0.1/path/?arg=')
+
+    assert obj1.url_identifier == obj2.url_identifier
+    assert obj1.url_id() == obj2.url_id()
+    # Second call leverages the lazy reference (so it's much faster)
+    assert obj1.url_id() == obj2.url_id()
+    # Disable Lazy Setting
+    assert obj1.url_id(lazy=False) == obj2.url_id(lazy=False)
+
+    # A variation such as providing a password or altering the path makes the
+    # url_id() different:
+    obj2 = Apprise.instantiate('json://user@127.0.0.1/path2/?arg=')  # path
+    assert obj1.url_id() != obj2.url_id()
+    obj2 = Apprise.instantiate(
+        'jsons://user@127.0.0.1/path/?arg=')  # secure flag
+    assert obj1.url_id() != obj2.url_id()
+    obj2 = Apprise.instantiate(
+        'json://user2@127.0.0.1/path/?arg=')  # user
+    assert obj1.url_id() != obj2.url_id()
+    obj2 = Apprise.instantiate(
+        'json://user@127.0.0.1:8080/path/?arg=')  # port
+    assert obj1.url_id() != obj2.url_id()
+    obj2 = Apprise.instantiate(
+        'json://user:pass@127.0.0.1/path/?arg=')  # password
+    assert obj1.url_id() != obj2.url_id()
+
+    # Leverage salt setting
+    asset = AppriseAsset(storage_salt='abcd')
+
+    obj2 = Apprise.instantiate('json://user@127.0.0.1/path/', asset=asset)
+    assert obj1.url_id(lazy=False) != obj2.url_id(lazy=False)
+
+    asset = AppriseAsset(storage_salt=b'abcd')
+    # same salt value produces a match again
+    obj1 = Apprise.instantiate('json://user@127.0.0.1/path/', asset=asset)
+    assert obj1.url_id() == obj2.url_id()
+
+    # We'll add a good notification to our list
+    class TestNoURLID(NotifyBase):
+        """
+        This class just sets up a use case where we don't return a
+        url_identifier
+        """
+
+        # we'll use this as a key to make our service easier to find
+        # in the next part of the testing
+        service_name = 'nourl'
+
+        _url_identifier = False
+
+        def send(self, **kwargs):
+            # Pretend everything is okay (so we don't break other tests)
+            return True
+
+        @staticmethod
+        def parse_url(url):
+            return NotifyBase.parse_url(url, verify_host=False)
+
+        @property
+        def url_identifier(self):
+            """
+            No URL Identifier
+            """
+            return self._url_identifier
+
+    N_MGR['nourl'] = TestNoURLID
+
+    # setting URL Identifier to False disables the generator
+    url = 'nourl://'
+    obj = Apprise.instantiate(url)
+    # No generation takes place
+    assert obj.url_id() is None
+
+    #
+    # Dictionary Testing
+    #
+    obj._url_identifier = {
+        'abc': '123', 'def': b'\0', 'hij': 42, 'klm': object}
+    # call uses cached value (from above)
+    assert obj.url_id() is None
+    # Tests dictionary key generation
+    assert obj.url_id(lazy=False) is not None
+
+    # List/Set/Tuple Testing
+    #
+    obj1 = Apprise.instantiate(url)
+    obj1._url_identifier = ['123', b'\0', 42, object]
+    # Tests list-based generation
+    assert obj1.url_id() is not None
+
+    obj2 = Apprise.instantiate(url)
+    obj2._url_identifier = ('123', b'\0', 42, object)
+    assert obj2.url_id() is not None
+    assert obj2.url_id() == obj2.url_id()
+
+    obj3 = Apprise.instantiate(url)
+    obj3._url_identifier = set(['123', b'\0', 42, object])
+    assert obj3.url_id() is not None
+
+    obj = Apprise.instantiate(url)
+    obj._url_identifier = b'test'
+    assert obj.url_id() is not None
+
+    obj = Apprise.instantiate(url)
+    obj._url_identifier = 'test'
+    assert obj.url_id() is not None
+
+    # Testing Garbage
+    for x in (31, object, 43.1):
+        obj = Apprise.instantiate(url)
+        obj._url_identifier = x
+        assert obj.url_id() is not None
+
+
+def test_apprise_notify_formats():
     """
     API: Apprise() Input Formats tests
 
@@ -855,12 +990,7 @@ def test_apprise_notify_formats(tmpdir):
             # Pretend everything is okay
             return True
 
-        def url(self, **kwargs):
-            # Support URL
-            return ''
-
     class HtmlNotification(NotifyBase):
-
         # set our default notification format
         notify_format = NotifyFormat.HTML
 
@@ -871,12 +1001,7 @@ def test_apprise_notify_formats(tmpdir):
             # Pretend everything is okay
             return True
 
-        def url(self, **kwargs):
-            # Support URL
-            return ''
-
     class MarkDownNotification(NotifyBase):
-
         # set our default notification format
         notify_format = NotifyFormat.MARKDOWN
 
@@ -887,10 +1012,6 @@ def test_apprise_notify_formats(tmpdir):
             # Pretend everything is okay
             return True
 
-        def url(self, **kwargs):
-            # Support URL
-            return ''
-
     # Store our notifications into our schema map
     N_MGR['text'] = TextNotification
     N_MGR['html'] = HtmlNotification
@@ -1074,6 +1195,9 @@ def test_apprise_asset(tmpdir):
         extension='.test') == \
         'http://localhost/default/info-256x256.test'
 
+    a = AppriseAsset(plugin_paths=('/tmp',))
+    assert a.plugin_paths == ('/tmp', )
+
 
 def test_apprise_disabled_plugins():
     """
@@ -1096,10 +1220,6 @@ def test_apprise_disabled_plugins():
         # in the next part of the testing
         service_name = 'na01'
 
-        def url(self, **kwargs):
-            # Support URL
-            return ''
-
         def notify(self, **kwargs):
             # Pretend everything is okay (so we don't break other tests)
             return True
@@ -1121,10 +1241,6 @@ def test_apprise_disabled_plugins():
             # enable state changes **AFTER** we initialize
             self.enabled = False
 
-        def url(self, **kwargs):
-            # Support URL
-            return ''
-
         def notify(self, **kwargs):
             # Pretend everything is okay (so we don't break other tests)
             return True
@@ -1179,10 +1295,6 @@ def test_apprise_disabled_plugins():
         # in the next part of the testing
         service_name = 'good'
 
-        def url(self, **kwargs):
-            # Support URL
-            return ''
-
         def send(self, **kwargs):
             # Pretend everything is okay (so we don't break other tests)
             return True
@@ -1315,10 +1427,6 @@ def test_apprise_details():
             }
         })
 
-        def url(self, **kwargs):
-            # Support URL
-            return ''
-
         def send(self, **kwargs):
             # Pretend everything is okay (so we don't break other tests)
             return True
@@ -1341,10 +1449,6 @@ def test_apprise_details():
             'packages_recommended': 'django',
         }
 
-        def url(self, **kwargs):
-            # Support URL
-            return ''
-
         def send(self, **kwargs):
             # Pretend everything is okay (so we don't break other tests)
             return True
@@ -1371,10 +1475,6 @@ def test_apprise_details():
             ]
         }
 
-        def url(self, **kwargs):
-            # Support URL
-            return ''
-
         def send(self, **kwargs):
             # Pretend everything is okay (so we don't break other tests)
             return True
@@ -1397,10 +1497,6 @@ def test_apprise_details():
             'packages_recommended': 'cryptography <= 3.4'
         }
 
-        def url(self, **kwargs):
-            # Support URL
-            return ''
-
         def send(self, **kwargs):
             # Pretend everything is okay (so we don't break other tests)
             return True
@@ -1417,10 +1513,6 @@ def test_apprise_details():
         # This is the same as saying there are no requirements
         requirements = None
 
-        def url(self, **kwargs):
-            # Support URL
-            return ''
-
         def send(self, **kwargs):
             # Pretend everything is okay (so we don't break other tests)
             return True
@@ -1439,10 +1531,6 @@ def test_apprise_details():
             'packages_recommended': 'cryptography <= 3.4'
         }
 
-        def url(self, **kwargs):
-            # Support URL
-            return ''
-
         def send(self, **kwargs):
             # Pretend everything is okay (so we don't break other tests)
             return True
@@ -1571,7 +1659,7 @@ def test_apprise_details_plugin_verification():
         # NotifyBase parameters:
         'format', 'overflow', 'emojis',
         # URLBase parameters:
-        'verify', 'cto', 'rto',
+        'verify', 'cto', 'rto', 'store',
     ])
 
     # Valid Schema Entries:
@@ -1877,7 +1965,7 @@ def test_apprise_details_plugin_verification():
 @mock.patch('asyncio.gather', wraps=asyncio.gather)
 @mock.patch('concurrent.futures.ThreadPoolExecutor',
             wraps=concurrent.futures.ThreadPoolExecutor)
-def test_apprise_async_mode(mock_threadpool, mock_gather, mock_post, tmpdir):
+def test_apprise_async_mode(mock_threadpool, mock_gather, mock_post):
     """
     API: Apprise() async_mode tests
 
diff --git a/test/test_apprise_cli.py b/test/test_apprise_cli.py
index 2eef233d..52051d5a 100644
--- a/test/test_apprise_cli.py
+++ b/test/test_apprise_cli.py
@@ -26,6 +26,7 @@
 # ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
 # POSSIBILITY OF SUCH DAMAGE.
 
+import os
 import re
 from unittest import mock
 
@@ -398,10 +399,10 @@ def test_apprise_cli_nux_env(tmpdir):
     ])
     assert result.exit_code == 0
     lines = re.split(r'[\r\n]', result.output.strip())
-    # 5 lines of all good:// entries matched
-    assert len(lines) == 5
+    # 5 lines of all good:// entries matched + url id underneath
+    assert len(lines) == 10
     # Verify we match against the remaining good:// entries
-    for i in range(0, 5):
+    for i in range(0, 10, 2):
         assert lines[i].endswith('good://')
 
     # This will fail because nothing matches mytag. It's case sensitive
@@ -676,6 +677,450 @@ def test_apprise_cli_modules(tmpdir):
     assert result.exit_code == 0
 
 
+def test_apprise_cli_persistent_storage(tmpdir):
+    """
+    CLI: test persistent storage
+
+    """
+
+    # This is a made up class that is just used to verify
+    class NoURLIDNotification(NotifyBase):
+        """
+        A notification with no URL ID
+        """
+
+        # Update URL identifier
+        url_identifier = False
+
+        def __init__(self, *args, **kwargs):
+            super().__init__(*args, **kwargs)
+
+        def send(self, **kwargs):
+
+            # Pretend everything is okay
+            return True
+
+        def url(self, *args, **kwargs):
+            # Support URL
+            return 'noper://'
+
+        def parse_url(self, *args, **kwargs):
+            # parse our url
+            return {'schema': 'noper'}
+
+    # This is a made up class that is just used to verify
+    class TestNotification(NotifyBase):
+        """
+        A Testing Script
+        """
+
+        def __init__(self, *args, **kwargs):
+            super().__init__(*args, **kwargs)
+
+        def send(self, **kwargs):
+
+            # Test our persistent settings
+            self.store.set('key', 'value')
+            assert self.store.get('key') == 'value'
+
+            # Pretend everything is okay
+            return True
+
+        def url(self, *args, **kwargs):
+            # Support URL
+            return 'test://'
+
+        def parse_url(self, *args, **kwargs):
+            # parse our url
+            return {'schema': 'test'}
+
+    # assign test:// to our notification defined above
+    N_MGR['test'] = TestNotification
+    N_MGR['noper'] = NoURLIDNotification
+
+    # Write a simple text based configuration file
+    config = tmpdir.join("apprise.cfg")
+    buf = cleandoc("""
+    # Create a config file we can source easily
+    test=test://
+    noper=noper://
+
+    # Define a second test URL that will share the same url id
+    two-urls=test://
+
+    # Create another entry that has no tag associated with it
+    test://?entry=2
+    """)
+    config.write(buf)
+
+    runner = CliRunner()
+
+    # Generate notification that creates persistent data
+    result = runner.invoke(cli.main, [
+        '--storage-path', str(tmpdir),
+        '--config', str(config),
+        'storage',
+        'list',
+    ])
+    # list our entries
+    assert result.exit_code == 0
+
+    # our persistent storage has not been created yet
+    _stdout = result.stdout.strip()
+    assert re.match(
+        r'^1\.\s+[a-z0-9_-]{8}\s+0\.00B\s+unused\s+-\s+test://$', _stdout,
+        re.MULTILINE)
+
+    # An invalid mode specified
+    result = runner.invoke(cli.main, [
+        '--storage-path', str(tmpdir),
+        '--storage-mode', 'invalid',
+        '--config', str(config),
+        '-g', 'test',
+        '-t', 'title',
+        '-b', 'body',
+    ])
+    # Bad mode specified
+    assert result.exit_code == 2
+
+    # Invalid uid length specified
+    result = runner.invoke(cli.main, [
+        '--storage-path', str(tmpdir),
+        '--storage-mode', 'flush',
+        '--storage-uid-length', 1,
+        '--config', str(config),
+        '-g', 'test',
+        '-t', 'title',
+        '-b', 'body',
+    ])
+    # storage uid length too small
+    assert result.exit_code == 2
+
+    # No files written yet; just config file exists
+    dir_content = os.listdir(str(tmpdir))
+    assert len(dir_content) == 1
+    assert 'apprise.cfg' in dir_content
+
+    # Generate notification that creates persistent data
+    result = runner.invoke(cli.main, [
+        '--storage-path', str(tmpdir),
+        '--storage-mode', 'flush',
+        '--config', str(config),
+        '-t', 'title',
+        '-b', 'body',
+        '-g', 'test',
+    ])
+    # We parsed our data accordingly
+    assert result.exit_code == 0
+
+    dir_content = os.listdir(str(tmpdir))
+    assert len(dir_content) == 2
+    assert 'apprise.cfg' in dir_content
+    assert 'ea482db7' in dir_content
+
+    # Have a look at our storage listings
+    result = runner.invoke(cli.main, [
+        '--storage-path', str(tmpdir),
+        '--config', str(config),
+        'storage',
+        'list',
+    ])
+    # list our entries
+    assert result.exit_code == 0
+
+    _stdout = result.stdout.strip()
+    assert re.match(
+        r'^1\.\s+[a-z0-9_-]{8}\s+81\.00B\s+active\s+-\s+test://$', _stdout,
+        re.MULTILINE)
+
+    # keyword list is not required
+    result = runner.invoke(cli.main, [
+        '--storage-path', str(tmpdir),
+        '--config', str(config),
+        'storage',
+    ])
+    # list our entries
+    assert result.exit_code == 0
+
+    _stdout = result.stdout.strip()
+    assert re.match(
+        r'^1\.\s+[a-z0-9_-]{8}\s+81\.00B\s+active\s+-\s+test://$', _stdout,
+        re.MULTILINE)
+
+    # search on something that won't match
+    result = runner.invoke(cli.main, [
+        '--storage-path', str(tmpdir),
+        '--config', str(config),
+        'storage',
+        'list',
+        'nomatch',
+    ])
+    # list our entries
+    assert result.exit_code == 0
+
+    assert not result.stdout.strip()
+
+    # closest match search
+    result = runner.invoke(cli.main, [
+        '--storage-path', str(tmpdir),
+        '--config', str(config),
+        'storage',
+        'list',
+        # Closest match will hit a result
+        'ea',
+    ])
+    # list our entries
+    assert result.exit_code == 0
+
+    _stdout = result.stdout.strip()
+    assert re.match(
+        r'^1\.\s+[a-z0-9_-]{8}\s+81\.00B\s+active\s+-\s+test://$', _stdout,
+        re.MULTILINE)
+
+    # list is the presumed option if no match
+    result = runner.invoke(cli.main, [
+        '--storage-path', str(tmpdir),
+        '--config', str(config),
+        'storage',
+        # Closest match will hit a result
+        'ea',
+    ])
+    # list our entries successfully again..
+    assert result.exit_code == 0
+
+    _stdout = result.stdout.strip()
+    assert re.match(
+        r'^1\.\s+[a-z0-9_-]{8}\s+81\.00B\s+active\s+-\s+test://$', _stdout,
+        re.MULTILINE)
+
+    # Search based on tag
+    result = runner.invoke(cli.main, [
+        '--storage-path', str(tmpdir),
+        '--config', str(config),
+        'storage',
+        'list',
+        # We can match by tags too
+        '-g', 'test',
+    ])
+    # list our entries
+    assert result.exit_code == 0
+
+    _stdout = result.stdout.strip()
+    assert re.match(
+        r'^1\.\s+[a-z0-9_-]{8}\s+81\.00B\s+active\s+-\s+test://$', _stdout,
+        re.MULTILINE)
+
+    # Prune call but prune-days set incorrectly
+    result = runner.invoke(cli.main, [
+        '--storage-path', str(tmpdir),
+        '--storage-prune-days', -1,
+        'storage',
+        'prune',
+    ])
+    # storage prune days is invalid
+    assert result.exit_code == 2
+
+    # Create a temporary namespace
+    tmpdir.mkdir('namespace')
+
+    # Generates another listing
+    result = runner.invoke(cli.main, [
+        '--storage-path', str(tmpdir),
+        '--config', str(config),
+        'storage',
+    ])
+
+    # list our entries
+    assert result.exit_code == 0
+
+    _stdout = result.stdout.strip()
+    assert re.match(
+        r'^[0-9]\.\s+[a-z0-9_-]{8}\s+81\.00B\s+active\s+-\s+test://$', _stdout,
+        re.MULTILINE)
+    assert re.match(
+        r'.*\s*[0-9]\.\s+namespace\s+0\.00B\s+stale.*', _stdout,
+        (re.MULTILINE | re.DOTALL))
+
+    # Generates another listing but utilize the tag
+    result = runner.invoke(cli.main, [
+        '--storage-path', str(tmpdir),
+        '--config', str(config),
+        '--tag', 'test',
+        'storage',
+    ])
+
+    # list our entries
+    assert result.exit_code == 0
+
+    _stdout = result.stdout.strip()
+    assert re.match(
+        r'^[0-9]\.\s+[a-z0-9_-]{8}\s+81\.00B\s+active\s+-\s+test://$', _stdout,
+        re.MULTILINE)
+    assert re.match(
+        r'.*\s*[0-9]\.\s+namespace\s+0\.00B\s+stale.*', _stdout,
+        (re.MULTILINE | re.DOTALL)) is None
+
+    # Clear all of our accumulated disk space
+    result = runner.invoke(cli.main, [
+        '--storage-path', str(tmpdir),
+        '--config', str(config),
+        'storage',
+        'clear',
+    ])
+
+    # successful
+    assert result.exit_code == 0
+
+    # Generate another listing
+    result = runner.invoke(cli.main, [
+        '--storage-path', str(tmpdir),
+        '--config', str(config),
+        'storage',
+    ])
+
+    # list our entries
+    assert result.exit_code == 0
+
+    _stdout = result.stdout.strip()
+    # back to unused state and 0 bytes
+    assert re.match(
+        r'^[0-9]\.\s+[a-z0-9_-]{8}\s+0\.00B\s+unused\s+-\s+test://$', _stdout,
+        re.MULTILINE)
+    # namespace is gone now
+    assert re.match(
+        r'.*\s*[0-9]\.\s+namespace\s+0\.00B\s+stale.*', _stdout,
+        (re.MULTILINE | re.DOTALL)) is None
+
+    # Provide both tags and uid
+    result = runner.invoke(cli.main, [
+        '--storage-path', str(tmpdir),
+        '--config', str(config),
+        'storage',
+        'ea',
+        '-g', 'test',
+    ])
+
+    # list our entries
+    assert result.exit_code == 0
+
+    _stdout = result.stdout.strip()
+    # back to unused state and 0 bytes
+    assert re.match(
+        r'^[0-9]\.\s+[a-z0-9_-]{8}\s+0\.00B\s+unused\s+-\s+test://$', _stdout,
+        re.MULTILINE)
+
+    # Generate notification that creates persistent data
+    result = runner.invoke(cli.main, [
+        '--storage-path', str(tmpdir),
+        '--storage-mode', 'flush',
+        '--config', str(config),
+        '-t', 'title',
+        '-b', 'body',
+        '-g', 'test',
+    ])
+    # We parsed our data accordingly
+    assert result.exit_code == 0
+
+    # Have a look at our storage listings
+    result = runner.invoke(cli.main, [
+        '--storage-path', str(tmpdir),
+        '--config', str(config),
+        'storage',
+        'list',
+    ])
+    # list our entries
+    assert result.exit_code == 0
+
+    _stdout = result.stdout.strip()
+    assert re.match(
+        r'^1\.\s+[a-z0-9_-]{8}\s+81\.00B\s+active\s+-\s+test://$', _stdout,
+        re.MULTILINE)
+
+    # Prune call using the default prune-days
+    result = runner.invoke(cli.main, [
+        '--storage-path', str(tmpdir),
+        'storage',
+        'prune',
+    ])
+
+    # Run our prune successfully
+    assert result.exit_code == 0
+
+    # Have a look at our storage listings (expected no change in output)
+    result = runner.invoke(cli.main, [
+        '--storage-path', str(tmpdir),
+        '--config', str(config),
+        'storage',
+        'list',
+    ])
+    # list our entries
+    assert result.exit_code == 0
+
+    _stdout = result.stdout.strip()
+    assert re.match(
+        r'^1\.\s+[a-z0-9_-]{8}\s+81\.00B\s+active\s+-\s+test://$', _stdout,
+        re.MULTILINE)
+
+    # Prune call with prune-days set to zero
+    result = runner.invoke(cli.main, [
+        '--storage-path', str(tmpdir),
+        # zero simulates a full clean
+        '--storage-prune-days', 0,
+        'storage',
+        'prune',
+    ])
+
+    # Run our prune successfully
+    assert result.exit_code == 0
+
+    # Have a look at our storage listings (entries are now flushed to an
+    # unused state)
+    result = runner.invoke(cli.main, [
+        '--storage-path', str(tmpdir),
+        '--config', str(config),
+        'storage',
+        'list',
+    ])
+    # list our entries
+    assert result.exit_code == 0
+
+    _stdout = result.stdout.strip()
+    assert re.match(
+        r'^1\.\s+[a-z0-9_-]{8}\s+0\.00B\s+unused\s+-\s+test://$', _stdout,
+        re.MULTILINE)
+
+    # New Temporary namespace
+    new_persistent_base = tmpdir.mkdir('namespace')
+    with environ(APPRISE_STORAGE=str(new_persistent_base)):
+        # Reload our module
+        reload(cli)
+
+        # Nothing in our directory yet
+        dir_content = os.listdir(str(new_persistent_base))
+        assert len(dir_content) == 0
+
+        # Generate notification that creates persistent data
+        # storage path is pulled out of our environment variable
+        result = runner.invoke(cli.main, [
+            '--storage-mode', 'flush',
+            '--config', str(config),
+            '-t', 'title',
+            '-b', 'body',
+            '-g', 'test',
+        ])
+        # We parsed our data accordingly
+        assert result.exit_code == 0
+
+        # Now content exists
+        dir_content = os.listdir(str(new_persistent_base))
+        assert len(dir_content) == 1
+
+    # Reload our module with our environment variable gone
+    reload(cli)
+
+    # Clear loaded modules
+    N_MGR.unload_modules()
+
+
 def test_apprise_cli_details(tmpdir):
     """
     CLI: --details (-l)
diff --git a/test/test_apprise_utils.py b/test/test_apprise_utils.py
index 39293b01..aa7be179 100644
--- a/test/test_apprise_utils.py
+++ b/test/test_apprise_utils.py
@@ -29,6 +29,7 @@
 import re
 import os
 import sys
+from unittest import mock
 from inspect import cleandoc
 from urllib.parse import unquote
 
@@ -2740,3 +2741,174 @@ def test_cwe312_url():
     assert utils.cwe312_url(
         'slack://test@B4QP3WWB4/J3QWT41JM/XIl2ffpqXkzkwMXrJdevi7W3/'
         '#random') == 'slack://test@B...4/J...M/X...3/'
+
+
+def test_dict_base64_codec(tmpdir):
+    """
+    Test encoding/decoding of base64 content
+    """
+    original = {
+        'int': 1,
+        'float': 2.3,
+    }
+
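+    # encode_b64_dict() base64-encodes each value and prefixes it with
+    # 'b64:' so that decode_b64_dict() can detect and reverse the encoding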
+    encoded, needs_decoding = utils.encode_b64_dict(original)
+    assert encoded == {'int': 'b64:MQ==', 'float': 'b64:Mi4z'}
+    assert needs_decoding is True
+    decoded = utils.decode_b64_dict(encoded)
+    assert decoded == original
+
+    with mock.patch('json.dumps', side_effect=TypeError()):
+        encoded, needs_decoding = utils.encode_b64_dict(original)
+        # we failed
+        assert needs_decoding is False
+        assert encoded == {
+            'int': '1',
+            'float': '2.3',
+        }
+
+
+def test_dir_size(tmpdir):
+    """
+    Test dir size tool
+    """
+
+    # Nothing to find/see
+    size, _errors = utils.dir_size(str(tmpdir))
+    assert size == 0
+    assert len(_errors) == 0
+
+    # Write a file in our root directory
+    tmpdir.join('root.psdata').write('0' * 1024 * 1024)
+
+    # Prepare some more directories
+    namespace_1 = tmpdir.mkdir('abcdefg')
+    namespace_2 = tmpdir.mkdir('defghij')
+    namespace_2.join('cache.psdata').write('0' * 1024 * 1024)
+    size, _errors = utils.dir_size(str(tmpdir))
+    assert size == 1024 * 1024 * 2
+    assert len(_errors) == 0
+
+    # Write another file
+    namespace_1.join('cache.psdata').write('0' * 1024 * 1024)
+    size, _errors = utils.dir_size(str(tmpdir))
+    assert size == 1024 * 1024 * 3
+    assert len(_errors) == 0
+
+    size, _errors = utils.dir_size(str(namespace_1))
+    assert size == 1024 * 1024
+    assert len(_errors) == 0
+
+    # Create a directory inside one of our namespaces
+    subspace_1 = namespace_1.mkdir('zyx')
+    size, _errors = utils.dir_size(str(namespace_1))
+    assert size == 1024 * 1024
+
+    subspace_1.join('cache.psdata').write('0' * 1024 * 1024)
+    size, _errors = utils.dir_size(str(tmpdir))
+    assert size == 1024 * 1024 * 4
+    assert len(_errors) == 0
+
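+    # max_depth bounds how deep dir_size() recurses; directories beyond
+    # the limit are reported back as errors rather than silently skipped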
+    # Recursion limit reduced... no change at 2 since we can still reach
+    # everything two directories deep without a problem
+    size, _errors = utils.dir_size(str(tmpdir), max_depth=2)
+    assert size == 1024 * 1024 * 4
+    assert len(_errors) == 0
+
+    size, _errors = utils.dir_size(str(tmpdir), max_depth=1)
+    assert size == 1024 * 1024 * 3
+    # we can't get into our subspace_1
+    assert len(_errors) == 1
+    assert str(subspace_1) in _errors
+
+    size, _errors = utils.dir_size(str(tmpdir), max_depth=0)
+    assert size == 1024 * 1024
+    # we can't get into our namespace directories
+    assert len(_errors) == 2
+    assert str(namespace_1) in _errors
+    assert str(namespace_2) in _errors
+
+    # Let's cause problems now and test the output
+    size, _errors = utils.dir_size('invalid-directory', missing_okay=True)
+    assert size == 0
+    assert len(_errors) == 0
+
+    size, _errors = utils.dir_size('invalid-directory', missing_okay=False)
+    assert size == 0
+    assert len(_errors) == 1
+    assert 'invalid-directory' in _errors
+
+    with mock.patch('os.scandir', side_effect=OSError()):
+        size, _errors = utils.dir_size(str(tmpdir), missing_okay=True)
+        assert size == 0
+        assert len(_errors) == 1
+        assert str(tmpdir) in _errors
+
+    with mock.patch('os.scandir') as mock_scandir:
+        mock_entry = mock.MagicMock()
+        mock_entry.is_file.side_effect = OSError()
+        mock_entry.path = '/test/path'
+        # Mock the scandir return value to yield the mock entry
+        mock_scandir.return_value.__enter__.return_value = [mock_entry]
+
+        size, _errors = utils.dir_size(str(tmpdir))
+        assert size == 0
+        assert len(_errors) == 1
+        assert mock_entry.path in _errors
+
+    with mock.patch('os.scandir') as mock_scandir:
+        mock_entry = mock.MagicMock()
+        mock_entry.is_file.return_value = False
+        mock_entry.is_dir.side_effect = OSError()
+        mock_entry.path = '/test/path'
+        # Mock the scandir return value to yield the mock entry
+        mock_scandir.return_value.__enter__.return_value = [mock_entry]
+        size, _errors = utils.dir_size(str(tmpdir))
+        assert len(_errors) == 1
+        assert mock_entry.path in _errors
+
+    with mock.patch('os.scandir') as mock_scandir:
+        mock_entry = mock.MagicMock()
+        mock_entry.is_file.return_value = False
+        mock_entry.is_dir.return_value = False
+        # Mock the scandir return value to yield the mock entry
+        mock_scandir.return_value.__enter__.return_value = [mock_entry]
+        size, _errors = utils.dir_size(str(tmpdir))
+        assert size == 0
+        assert len(_errors) == 0
+
+    with mock.patch('os.scandir') as mock_scandir:
+        mock_entry = mock.MagicMock()
+        mock_entry.is_file.side_effect = FileNotFoundError()
+        mock_entry.path = '/test/path'
+        # Mock the scandir return value to yield the mock entry
+        mock_scandir.return_value.__enter__.return_value = [mock_entry]
+
+        size, _errors = utils.dir_size(str(tmpdir))
+        assert size == 0
+        # A missing file isn't a problem; we're calculating disk usage
+        # anyway, so it's simply one less thing to count
+        assert len(_errors) == 0
+
+
+def test_bytes_to_str():
+    """
+    Test Bytes to String representation
+    """
+    # Garbage Entry
+    assert utils.bytes_to_str(None) is None
+    assert utils.bytes_to_str('') is None
+    assert utils.bytes_to_str('GARBAGE') is None
+
+    # Good Entries
+    assert utils.bytes_to_str(0) == "0.00B"
+    assert utils.bytes_to_str(1) == "1.00B"
+    assert utils.bytes_to_str(1.1) == "1.10B"
+    assert utils.bytes_to_str(1024) == "1.00KB"
+    assert utils.bytes_to_str(1024 * 1024) == "1.00MB"
+    assert utils.bytes_to_str(1024 * 1024 * 1024) == "1.00GB"
+    assert utils.bytes_to_str(1024 * 1024 * 1024 * 1024) == "1.00TB"
+
+    # Support strings too
+    assert utils.bytes_to_str("0") == "0.00B"
+    assert utils.bytes_to_str("1024") == "1.00KB"
diff --git a/test/test_attach_http.py b/test/test_attach_http.py
index ad58ed91..36ecbad5 100644
--- a/test/test_attach_http.py
+++ b/test/test_attach_http.py
@@ -86,15 +86,21 @@ def test_attach_http_query_string_dictionary():
 
     """
 
-    # no qsd specified
-    results = AttachHTTP.parse_url('http://localhost')
+    # Set verify off
+    results = AttachHTTP.parse_url('http://localhost?verify=no&rto=9&cto=8')
     assert isinstance(results, dict)
 
     # Create our object
     obj = AttachHTTP(**results)
     assert isinstance(obj, AttachHTTP)
 
-    assert re.search(r'[?&]verify=yes', obj.url())
+    # verify is disabled and is therefore explicitly set in the URL
+    assert re.search(r'[?&]verify=no', obj.url())
+
+    # Our connect timeout flag is set since it differs from the default
+    assert re.search(r'[?&]cto=8', obj.url())
+    # Our read timeout flag is set since it differs from the default
+    assert re.search(r'[?&]rto=9', obj.url())
 
     # Now lets create a URL with a custom Query String entry
 
@@ -106,7 +112,8 @@ def test_attach_http_query_string_dictionary():
     obj = AttachHTTP(**results)
     assert isinstance(obj, AttachHTTP)
 
-    assert re.search(r'[?&]verify=yes', obj.url())
+    # verify is not in the URL as it is implied (default)
+    assert not re.search(r'[?&]verify=yes', obj.url())
 
     # But now test that our custom arguments have also been set
     assert re.search(r'[?&]dl=1', obj.url())
diff --git a/test/test_persistent_store.py b/test/test_persistent_store.py
new file mode 100644
index 00000000..20cabb52
--- /dev/null
+++ b/test/test_persistent_store.py
@@ -0,0 +1,1522 @@
+# -*- coding: utf-8 -*-
+# BSD 2-Clause License
+#
+# Apprise - Push Notification Library.
+# Copyright (c) 2024, Chris Caron <lead2gold@gmail.com>
+#
+# Redistribution and use in source and binary forms, with or without
+# modification, are permitted provided that the following conditions are met:
+#
+# 1. Redistributions of source code must retain the above copyright notice,
+#    this list of conditions and the following disclaimer.
+#
+# 2. Redistributions in binary form must reproduce the above copyright notice,
+#    this list of conditions and the following disclaimer in the documentation
+#    and/or other materials provided with the distribution.
+#
+# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
+# AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
+# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
+# ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE
+# LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
+# CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
+# SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
+# INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
+# CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
+# ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
+# POSSIBILITY OF SUCH DAMAGE.
+
+import time
+import os
+import zlib
+import pytest
+import shutil
+import json
+import gzip
+from unittest import mock
+from datetime import datetime, timedelta, timezone
+from apprise import exception
+from apprise.asset import AppriseAsset
+from apprise.persistent_store import (
+    CacheJSONEncoder, CacheObject, PersistentStore, PersistentStoreMode)
+
+# Disable logging for a cleaner testing output
+import logging
+logging.disable(logging.CRITICAL)
+
+# Attachment Directory
+TEST_VAR_DIR = os.path.join(os.path.dirname(__file__), 'var')
+
+
+def test_persistent_storage_asset(tmpdir):
+    """
+    Tests the Apprise Asset Object when setting the Persistent Store
+    """
+
+    asset = AppriseAsset(storage_path=str(tmpdir))
+    assert asset.storage_path == str(tmpdir)
+    assert asset.storage_mode is PersistentStoreMode.AUTO
+
+    # If there is no storage path, we're always set to memory
+    asset = AppriseAsset(
+        storage_path=None, storage_mode=PersistentStoreMode.MEMORY)
+    assert asset.storage_path is None
+    assert asset.storage_mode is PersistentStoreMode.MEMORY
+
+
+def test_disabled_persistent_storage(tmpdir):
+    """
+    Persistent Storage Disabled (Memory Mode) Testing
+
+    """
+    # Create ourselves an attachment object set in Memory Mode only
+    pc = PersistentStore(
+        namespace='abc', path=str(tmpdir), mode=PersistentStoreMode.MEMORY)
+    assert pc.read() is None
+    assert pc.read('mykey') is None
+    with pytest.raises(AttributeError):
+        # Invalid key specified
+        pc.read('!invalid')
+    assert pc.write('data') is False
+    assert pc.get('key') is None
+    assert pc.set('key', 'value')
+    assert pc.get('key') == 'value'
+
+    assert pc.set('key2', 'value')
+    pc.clear('key', 'key-not-previously-set')
+    assert pc.get('key2') == 'value'
+    assert pc.get('key') is None
+
+    # Set it again
+    assert pc.set('key', 'another-value')
+    # Clears all
+    pc.clear()
+    assert pc.get('key2') is None
+    assert pc.get('key') is None
+    # A second call to clear on an already empty cache set
+    pc.clear()
+
+    # No dirty flag is set as there is nothing to write to disk
+    pc.set('not-persistent', 'value', persistent=False)
+    del pc['not-persistent']
+    with pytest.raises(KeyError):
+        # Can't delete it twice
+        del pc['not-persistent']
+
+    # A Persistent key
+    pc.set('persistent', 'value')
+    # Removes it and sets/clears the dirty flag
+    del pc['persistent']
+
+    # After all of the above, nothing was done to the directory
+    assert len(os.listdir(str(tmpdir))) == 0
+
+    with pytest.raises(AttributeError):
+        # invalid persistent store specified
+        PersistentStore(
+            namespace='abc', path=str(tmpdir), mode='garbage')
+
+
+def test_persistent_storage_init(tmpdir):
+    """
+    Test storage initialization
+    """
+    with pytest.raises(AttributeError):
+        PersistentStore(namespace="", path=str(tmpdir))
+    with pytest.raises(AttributeError):
+        PersistentStore(namespace=None, path=str(tmpdir))
+
+    with pytest.raises(AttributeError):
+        PersistentStore(namespace="_", path=str(tmpdir))
+    with pytest.raises(AttributeError):
+        PersistentStore(namespace=".", path=str(tmpdir))
+    with pytest.raises(AttributeError):
+        PersistentStore(namespace="-", path=str(tmpdir))
+
+    with pytest.raises(AttributeError):
+        PersistentStore(namespace="_abc", path=str(tmpdir))
+    with pytest.raises(AttributeError):
+        PersistentStore(namespace=".abc", path=str(tmpdir))
+    with pytest.raises(AttributeError):
+        PersistentStore(namespace="-abc", path=str(tmpdir))
+
+    with pytest.raises(AttributeError):
+        PersistentStore(namespace="%", path=str(tmpdir))
+
+
+def test_persistent_storage_general(tmpdir):
+    """
+    Persistent Storage General Testing
+
+    """
+    namespace = 'abc'
+    # Create ourselves an attachment object
+    pc = PersistentStore()
+
+    # Default mode when a path is not provided
+    assert pc.mode == PersistentStoreMode.MEMORY
+
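+    # With no path supplied, nothing ever touches the disk; size() and
+    # files() always report an empty footprint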
+    assert pc.size() == 0
+    assert pc.files() == []
+    assert pc.files(exclude=True, lazy=False) == []
+    assert pc.files(exclude=False, lazy=False) == []
+    pc.set('key', 'value')
+    # There is no disk size utilized
+    assert pc.size() == 0
+    assert pc.files(exclude=True, lazy=False) == []
+    assert pc.files(exclude=False, lazy=False) == []
+
+    # Create ourselves an attachment object
+    pc = PersistentStore(
+        namespace=namespace, path=str(tmpdir))
+
+    # Default mode when a path is provided
+    assert pc.mode == PersistentStoreMode.AUTO
+
+    # Get our path associated with our Persistent Store
+    assert pc.path == os.path.join(str(tmpdir), 'abc')
+
+    # Expiry testing
+    assert pc.set('key', 'value', datetime.now() + timedelta(hours=1))
+    # 1 minute (60 seconds) in the future
+    assert pc.set('key', 'value', 60)
+
+    with pytest.raises(AttributeError):
+        assert pc.set('key', 'value', 'invalid')
+
+    pc = PersistentStore(
+        namespace=namespace, path=str(tmpdir))
+
+    # Our key is still valid and we load it from disk
+    assert pc.get('key') == 'value'
+    assert pc['key'] == 'value'
+
+    pc = PersistentStore(
+        namespace=namespace, path=str(tmpdir))
+    assert pc.keys()
+    # Second call after already initialized skips over initialization
+    assert pc.keys()
+
+    pc = PersistentStore(
+        namespace=namespace, path=str(tmpdir))
+
+    with pytest.raises(KeyError):
+        # The key was never set, so indexing it raises a KeyError
+        pc['unassigned_key']
+
+
+def test_persistent_storage_auto_mode(tmpdir):
+    """
+    Persistent Storage Auto Write Testing
+
+    """
+    namespace = 'abc'
+    # Create ourselves an attachment object
+    pc = PersistentStore(
+        namespace=namespace, path=str(tmpdir),
+        mode=PersistentStoreMode.AUTO)
+
+    pc.write(b'test')
+    with mock.patch('os.unlink', side_effect=FileNotFoundError()):
+        assert pc.delete(all=True) is True
+
+    # Create a temporary file we can delete
+    with open(os.path.join(pc.path, pc.temp_dir, 'test.file'), 'wb') as fd:
+        fd.write(b'data')
+
+    # Delete just the temporary files
+    assert pc.delete(temp=True) is True
+
+    # Delete just the temporary files
+    # Create a cache entry and delete it
+    assert pc.set('key', 'value') is True
+    pc.write(b'test')
+    assert pc.delete(cache=True) is True
+    # Verify our data entry wasn't removed
+    assert pc.read() == b'test'
+    # But our cache was
+    assert pc.get('key') is None
+
+    # A reverse of the above... create a cache and a data entry, then
+    # clear the data; make sure our cache entry is still there
+    assert pc.set('key', 'value') is True
+    assert pc.write(b'test', key='iokey') is True
+    assert pc.delete('iokey') is True
+    assert pc.get('key') == 'value'
+    assert pc.read('iokey') is None
+
+
+def test_persistent_storage_flush_mode(tmpdir):
+    """
+    Persistent Storage Forced Write Testing
+
+    """
+    namespace = 'abc'
+    # Create ourselves an attachment object
+    pc = PersistentStore(
+        namespace=namespace, path=str(tmpdir),
+        mode=PersistentStoreMode.FLUSH)
+
+    # Reference path
+    path = os.path.join(str(tmpdir), namespace)
+
+    assert pc.size() == 0
+    assert list(pc.files()) == []
+
+    # Key is not set yet
+    assert pc.get('key') is None
+    assert len(pc.keys()) == 0
+    assert 'key' not in pc
+
+    # Verify our data is set
+    assert pc.set('key', 'value')
+    assert len(pc.keys()) == 1
+    assert 'key' in list(pc.keys())
+
+    assert pc.size() > 0
+    assert len(pc.files()) == 1
+
+    # Second call uses Lazy cache
+    # Just our cache file
+    assert len(pc.files()) == 1
+
+    # Setting the same value again uses a lazy mode and
+    # bypasses all of the write overhead
+    assert pc.set('key', 'value')
+
+    path_content = os.listdir(path)
+    # var, cache.psdata, and tmp
+    assert len(path_content) == 3
+
+    # Assignments (causes another disk write)
+    pc['key'] = 'value2'
+
+    # Setting the same value and explicitly marking the field as not
+    # being persistent
+    pc.set('key-xx', 'abc123', persistent=False)
+    # Changing its value doesn't alter the persistent flag
+    pc['key-xx'] = 'def678'
+    # Setting it twice
+    pc['key-xx'] = 'def678'
+
+    # Our retrievals
+    assert pc['key-xx'] == 'def678'
+    assert pc.get('key-xx') == 'def678'
+
+    # But on the destruction of our object, it is not available again
+    del pc
+    # Create ourselves an attachment object
+    pc = PersistentStore(
+        namespace=namespace, path=str(tmpdir),
+        mode=PersistentStoreMode.FLUSH)
+
+    assert pc.get('key-xx') is None
+    with pytest.raises(KeyError):
+        pc['key-xx']
+
+    # Now our key is set
+    assert 'key' in pc
+    assert pc.get('key') == 'value2'
+
+    # A directory was created identified by the namespace
+    assert len(os.listdir(str(tmpdir))) == 1
+    assert namespace in os.listdir(str(tmpdir))
+
+    path_content = os.listdir(path)
+    assert len(path_content) == 4
+
+    # Another write doesn't change the file count
+    pc['key'] = 'value3'
+    path_content = os.listdir(path)
+    assert len(path_content) == 4
+
+    # Our temporary directory used for all file handling in this namespace
+    assert pc.temp_dir in path_content
+    # Our cache file
+    assert os.path.basename(pc.cache_file) in path_content
+
+    path = os.path.join(pc.path, pc.temp_dir)
+    path_content = os.listdir(path)
+
+    # We always do our best to clean any temporary files up
+    assert len(path_content) == 0
+
+    # Destroy our object
+    del pc
+
+    # Re-initialize it
+    pc = PersistentStore(
+        namespace=namespace, path=str(tmpdir),
+        mode=PersistentStoreMode.FLUSH)
+
+    # Our key is persistent and available right away
+    assert pc.get('key') == 'value3'
+    assert 'key' in pc
+
+    # Remove our item
+    del pc['key']
+    assert pc.size() == 0
+    assert 'key' not in pc
+
+    assert pc.write('data') is True
+    assert pc.read() == b'data'
+    assert pc.write(b'data') is True
+    assert pc.read() == b'data'
+
+    assert pc.read('default') == b'data'
+    assert pc.write('data2', key='mykey') is True
+    assert pc.read('mykey') == b'data2'
+
+    # We can selectively delete our key
+    assert pc.delete('mykey')
+    assert pc.read('mykey') is None
+    # Other keys are not touched
+    assert pc.read('default') == b'data'
+    assert pc.read() == b'data'
+    # Full purge
+    assert pc.delete()
+    assert pc.read('mykey') is None
+    assert pc.read() is None
+
+    # Practice with files
+    with open(os.path.join(TEST_VAR_DIR, 'apprise-test.gif'), 'rb') as fd:
+        assert pc.write(fd, key='mykey', compress=False) is True
+
+        # Read our content back
+        fd.seek(0)
+        assert pc.read('mykey', compress=False) == fd.read()
+
+    with open(os.path.join(TEST_VAR_DIR, 'apprise-test.gif'), 'rb') as fd:
+        assert pc.write(fd, key='mykey', compress=True) is True
+
+        # Read our content back; content will be compressed
+        fd.seek(0)
+        assert pc.read('mykey', compress=True) == fd.read()
+
+    class Foobar:
+        def read(*args, **kwargs):
+            return 42
+
+    foobar = Foobar()
+    # read() returns a non string/bin
+    with pytest.raises(exception.AppriseDiskIOError):
+        pc.write(foobar, key='foobar', compress=True)
+    assert pc.read('foobar') is None
+
+    class Foobar:
+        def read(*args, **kwargs):
+            return 'good'
+
+    foobar = Foobar()
+    # read() returns a string so the below write works
+    assert pc.write(foobar, key='foobar', compress=True)
+    assert pc.read('foobar') == b'good'
+    pc.delete()
+
+    class Foobar:
+        def read(*args, **kwargs):
+            # Throw an exception
+            raise TypeError()
+
+    foobar = Foobar()
+    # read() returns a non string/bin
+    with pytest.raises(exception.AppriseDiskIOError):
+        pc.write(foobar, key='foobar', compress=True)
+    assert pc.read('foobar') is None
+
+    # Set our max_file_size
+    _prev_max_file_size = pc.max_file_size
+    pc.max_file_size = 1
+    assert pc.delete()
+
+    assert pc.write('data') is False
+    assert pc.read() is None
+
+    # Restore setting
+    pc.max_file_size = _prev_max_file_size
+
+    # Reset
+    pc.delete()
+
+    assert pc.write('data')
+    # Corrupt our data
+    data = pc.read(compress=False)[:20] + pc.read(compress=False)[:10]
+    pc.write(data, compress=False)
+
+    # Now we'll get an exception reading back the corrupted data
+    assert pc.read() is None
+
+    # Keep in mind though that the data is still there; operators who
+    # write and read the way they expect to will be just fine. This test
+    # simply proves that Apprise Persistent storage gracefully handles
+    # bad data
+    assert pc.read(compress=False) == data
+
+    # No key exists also returns None
+    assert pc.read('no-key-exists') is None
+
+    pc.write(b'test')
+    pc['key'] = 'value'
+    with mock.patch('os.unlink', side_effect=FileNotFoundError()):
+        assert pc.delete(all=True) is True
+    with mock.patch('os.unlink', side_effect=OSError()):
+        assert pc.delete(all=True) is False
+
+    # Create a temporary file we can delete
+    tmp_file = os.path.join(pc.path, pc.temp_dir, 'test.file')
+    with open(tmp_file, 'wb') as fd:
+        fd.write(b'data')
+
+    assert pc.set('key', 'value') is True
+    assert pc.write(b'test', key='iokey') is True
+    # Delete just the temporary files
+    assert pc.delete(temp=True) is True
+    assert os.path.exists(tmp_file) is False
+    # our other entries are untouched
+    assert pc.get('key') == 'value'
+    assert pc.read('iokey') == b'test'
+
+    # Delete just the temporary files
+    # Create a cache entry and delete it
+    assert pc.set('key', 'value') is True
+    pc.write(b'test')
+    assert pc.delete(cache=True) is True
+    # Verify our data entry wasn't removed
+    assert pc.read() == b'test'
+    # But our cache was
+    assert pc.get('key') is None
+
+    # A reverse of the above... create a cache and a data entry, then
+    # clear the data; make sure our cache entry is still there
+    assert pc.set('key', 'value') is True
+    assert pc.write(b'test', key='iokey') is True
+    assert pc.delete('iokey') is True
+    assert pc.get('key') == 'value'
+    assert pc.read('iokey') is None
+
+    # Create some custom files
+    cust1_file = os.path.join(pc.path, 'test.file')
+    cust2_file = os.path.join(pc.path, pc.data_dir, 'test.file')
+    with open(cust1_file, 'wb') as fd:
+        fd.write(b'data')
+    with open(cust2_file, 'wb') as fd:
+        fd.write(b'data')
+
+    # Even after a full flush our files will exist
+    assert pc.delete()
+    assert os.path.exists(cust1_file) is True
+    assert os.path.exists(cust2_file) is True
+
+    # However, if we turn off validate, we do a full sweep because these
+    # unknown files are lingering in our directory space
+    assert pc.delete(validate=False)
+    assert os.path.exists(cust1_file) is False
+    assert os.path.exists(cust2_file) is False
+
+    pc['key'] = 'value'
+    pc['key2'] = 'value2'
+    assert 'key' in pc
+    assert 'key2' in pc
+    pc.clear('key')
+    assert 'key' not in pc
+    assert 'key2' in pc
+
+    # Set expired content
+    pc.set(
+        'expired', 'expired-content',
+        expires=datetime.now() - timedelta(days=1))
+
+    # It's actually there... but it's expired so our persistent
+    # storage is behaving as it should
+    assert 'expired' not in pc
+    assert pc.get('expired') is None
+    # Prune our content
+    pc.prune()
+
+
+def test_persistent_storage_corruption_handling(tmpdir):
+    """
+    Test corruption handling of storage
+    """
+
+    # Namespace
+    namespace = 'def456'
+
+    # Initialize it
+    pc = PersistentStore(
+        namespace=namespace, path=str(tmpdir),
+        mode=PersistentStoreMode.FLUSH)
+
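+    # The cache file is a gzip-compressed JSON document kept inside the
+    # namespace directory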
+    cache_file = pc.cache_file
+    assert not os.path.isfile(cache_file)
+
+    # Store our key
+    pc['mykey'] = 42
+    assert os.path.isfile(cache_file)
+
+    with gzip.open(cache_file, 'rb') as f:
+        # Read our content from disk
+        json.loads(f.read().decode('utf-8'))
+
+    # Remove object
+    del pc
+
+    # Corrupt the file
+    with open(cache_file, 'wb') as f:
+        f.write(b'{')
+
+    pc = PersistentStore(
+        namespace=namespace, path=str(tmpdir),
+        mode=PersistentStoreMode.FLUSH)
+
+    # File is corrupted
+    assert 'mykey' not in pc
+    pc['mykey'] = 42
+    del pc
+
+    # File is corrected now
+    pc = PersistentStore(
+        namespace=namespace, path=str(tmpdir),
+        mode=PersistentStoreMode.FLUSH)
+
+    assert 'mykey' in pc
+
+    # Corrupt the file again
+    with gzip.open(cache_file, 'wb') as f:
+        # Bad JSON File
+        f.write(b'{')
+
+    pc = PersistentStore(
+        namespace=namespace, path=str(tmpdir),
+        mode=PersistentStoreMode.FLUSH)
+
+    # File is corrupted
+    assert 'mykey' not in pc
+    pc['mykey'] = 42
+    del pc
+
+    pc = PersistentStore(
+        namespace=namespace, path=str(tmpdir),
+        mode=PersistentStoreMode.FLUSH)
+
+    # Test our force flush
+    assert pc.flush(force=True) is True
+    # double call
+    assert pc.flush(force=True) is True
+
+    # OSError handling during the gzip open
+    with mock.patch('gzip.open', side_effect=OSError()):
+        with pytest.raises(KeyError):
+            pc['mykey'] = 43
+
+    pc = PersistentStore(
+        namespace=namespace, path=str(tmpdir),
+        mode=PersistentStoreMode.FLUSH)
+
+    # OSError handling as well during the gzip open
+    with mock.patch('gzip.open', side_effect=OSError()):
+        # No keys can be returned
+        assert not pc.keys()
+
+    pc = PersistentStore(
+        namespace=namespace, path=str(tmpdir),
+        mode=PersistentStoreMode.FLUSH)
+
+    with mock.patch('json.loads', side_effect=TypeError()):
+        with mock.patch('os.unlink', side_effect=FileNotFoundError()):
+            with pytest.raises(KeyError):
+                pc['mykey'] = 44
+
+    pc = PersistentStore(
+        namespace=namespace, path=str(tmpdir),
+        mode=PersistentStoreMode.FLUSH)
+
+    with mock.patch('json.loads', side_effect=TypeError()):
+        with mock.patch('os.unlink', side_effect=OSError()):
+            with pytest.raises(KeyError):
+                pc['mykey'] = 45
+
+    pc['my-new-key'] = 43
+    with mock.patch('gzip.open', side_effect=OSError()):
+        # We will fail to flush our content to disk
+        assert pc.flush(force=True) is False
+
+    with mock.patch('json.dumps', side_effect=TypeError()):
+        # We will fail to flush our content to disk
+        assert pc.flush(force=True) is False
+
+    with mock.patch('os.makedirs', side_effect=OSError()):
+        pc = PersistentStore(
+            namespace=namespace, path=str(tmpdir),
+            mode=PersistentStoreMode.FLUSH)
+
+        # Directory initialization failed so we fall back to memory mode
+        assert pc.mode == PersistentStoreMode.MEMORY
+
+    # Handle file updates
+    pc = PersistentStore(
+        namespace='file-time-refresh', path=str(tmpdir),
+        mode=PersistentStoreMode.AUTO)
+
+    pc['test'] = 'abcd'
+    assert pc.write(b'data', key='abcd') is True
+    assert pc.read('abcd', expires=True) == b'data'
+    assert pc.write(b'data2', key='defg') is True
+    assert pc.read('defg', expires=False) == b'data2'
+    assert pc.write(b'data3', key='hijk') is True
+    assert pc.read('hijk', expires=False) == b'data3'
+    assert pc['test'] == 'abcd'
+
+    with mock.patch('os.utime', side_effect=(OSError(), FileNotFoundError())):
+        pc.flush()
+
+    # directory initialization okay
+    pc = PersistentStore(
+        namespace=namespace, path=str(tmpdir),
+        mode=PersistentStoreMode.FLUSH)
+
+    assert 'mykey' not in pc
+    pc['mykey'] = 42
+    del pc
+
+    pc = PersistentStore(
+        namespace=namespace, path=str(tmpdir),
+        mode=PersistentStoreMode.FLUSH)
+    assert 'mykey' in pc
+
+    # Remove the last entry
+    del pc['mykey']
+    with mock.patch('os.rename', side_effect=OSError()):
+        with mock.patch('os.unlink', side_effect=OSError()):
+            assert not pc.flush(force=True)
+
+    # Create another entry
+    pc['mykey'] = 42
+    with mock.patch('tempfile.NamedTemporaryFile', side_effect=OSError()):
+        assert not pc.flush(force=True)
+
+        # Temporary file cleanup failure
+        with mock.patch('tempfile._TemporaryFileWrapper.close',
+                        side_effect=OSError()):
+            assert not pc.flush(force=True)
+
+    # Create another entry
+    pc['mykey'] = 43
+    mock_ntf = mock.MagicMock()
+    mock_ntf.name = os.path.join(tmpdir, 'file')
+
+    #
+    # Recursion loop checking
+    #
+    with mock.patch(
+            'tempfile.NamedTemporaryFile',
+            side_effect=[FileNotFoundError(), FileNotFoundError(), mock_ntf]):
+        # Ensure we can't fall into a recursion loop
+        assert not pc.flush(force=True, _recovery=True)
+
+    with mock.patch(
+            'tempfile.NamedTemporaryFile',
+            side_effect=[FileNotFoundError(), FileNotFoundError(), mock_ntf]):
+        # Ensure we can't fall into a recursion loop
+        assert not pc.flush(force=False, _recovery=True)
+
+    with mock.patch(
+            'tempfile.NamedTemporaryFile',
+            side_effect=[FileNotFoundError(), FileNotFoundError(), mock_ntf]):
+        # Ensure we can't fall into a recursion loop
+        assert not pc.flush(force=False, _recovery=False)
+
+    with mock.patch(
+            'tempfile.NamedTemporaryFile',
+            side_effect=[FileNotFoundError(), FileNotFoundError(), mock_ntf]):
+        # Ensure we can't fall into a recursion loop
+        assert not pc.flush(force=True, _recovery=False)
+
+    with mock.patch('tempfile._TemporaryFileWrapper.close',
+                    side_effect=(OSError(), None)):
+        with mock.patch('os.unlink', side_effect=(OSError())):
+            assert not pc.flush(force=True)
+
+    with mock.patch(
+            'tempfile._TemporaryFileWrapper.close', side_effect=OSError()):
+        assert not pc.flush(force=True)
+
+    with mock.patch(
+            'tempfile._TemporaryFileWrapper.close',
+            side_effect=(OSError(), None)):
+        with mock.patch('os.unlink', side_effect=OSError()):
+            assert not pc.flush(force=True)
+
+    with mock.patch(
+            'tempfile._TemporaryFileWrapper.close',
+            side_effect=(OSError(), None)):
+        with mock.patch('os.unlink', side_effect=FileNotFoundError()):
+            assert not pc.flush(force=True)
+
+    del pc
+
+    # directory initialization okay
+    pc = PersistentStore(
+        namespace=namespace, path=str(tmpdir),
+        mode=PersistentStoreMode.FLUSH)
+
+    # Allows us to play with encoding errors
+    pc.encoding = 'ascii'
+
+    # Handle write() calls
+    with mock.patch('os.stat', side_effect=OSError()):
+        # We fail to fetch the filesize of our old file causing us to fail
+        assert pc.write('abcd') is False
+
+    # ボールト translates to vault (no bad word here) :)
+    data = "ボールト"
+
+    # We'll have encoding issues
+    assert pc.write(data) is False
+
+    with mock.patch('gzip.open', side_effect=FileNotFoundError()):
+        pc = PersistentStore(namespace=namespace, path=str(tmpdir))
+
+        # recovery mode will kick in, but even it will fail
+        assert pc.write(b'key') is False
+
+    with mock.patch('gzip.open', side_effect=OSError()):
+        pc = PersistentStore(namespace=namespace, path=str(tmpdir))
+
+        # Falls to default
+        assert pc.get('key') is None
+
+        pc = PersistentStore(namespace=namespace, path=str(tmpdir))
+        with pytest.raises(KeyError):
+            pc['key'] = 'value'
+
+        pc = PersistentStore(namespace=namespace, path=str(tmpdir))
+        with pytest.raises(KeyError):
+            pc['key']
+
+        pc = PersistentStore(namespace=namespace, path=str(tmpdir))
+        with pytest.raises(KeyError):
+            del pc['key']
+
+        pc = PersistentStore(namespace=namespace, path=str(tmpdir))
+        # Fails to set key
+        assert pc.set('key', 'value') is False
+
+        pc = PersistentStore(namespace=namespace, path=str(tmpdir))
+        # Fails to clear
+        assert pc.clear() is False
+
+        pc = PersistentStore(namespace=namespace, path=str(tmpdir))
+        # Fails to prune
+        assert pc.prune() is False
+
+    # Set some expired content
+    pc.set(
+        'key', 'value', persistent=False,
+        expires=datetime.now() - timedelta(days=1))
+    pc.set(
+        'key2', 'value2', persistent=True,
+        expires=datetime.now() - timedelta(days=1))
+
+    # Set some un-expired content
+    pc.set('key3', 'value3', persistent=True)
+    pc.set('key4', 'value4', persistent=False)
+    assert pc.prune() is True
+
+    # Second call has no change made
+    assert pc.prune() is False
+
+    # Reset
+    pc.delete()
+
+    # directory initialization okay
+    pc = PersistentStore(
+        namespace=namespace, path=str(tmpdir),
+        mode=PersistentStoreMode.FLUSH)
+
+    # Write some content that expires almost immediately
+    pc.set(
+        'key1', 'value', persistent=True,
+        expires=datetime.now() + timedelta(seconds=1))
+    pc.set(
+        'key2', 'value', persistent=True,
+        expires=datetime.now() + timedelta(seconds=1))
+    pc.set(
+        'key3', 'value', persistent=True,
+        expires=datetime.now() + timedelta(seconds=1))
+    pc.flush()
+
+    # Wait out our expiry
+    time.sleep(1.3)
+
+    # now initialize our storage again
+    pc = PersistentStore(
+        namespace=namespace, path=str(tmpdir),
+        mode=PersistentStoreMode.FLUSH)
+
+    # This triggers our __load_cache() which reads in a value
+    # determined to have already been expired
+    assert 'key1' not in pc
+    assert 'key2' not in pc
+    assert 'key3' not in pc
+
+    # Sweep
+    pc.delete()
+    pc.set('key', 'value')
+    pc.set('key2', 'value2')
+    pc.write('more-content')
+    # Flush our content to disk
+    pc.flush()
+
+    # Ideally we'd use os.stat below, but it is called inside a list
+    # comprehension block and mock doesn't appear to throw the exception
+    # there.  So this is a bit of a cheat, but it works
+    with mock.patch('builtins.sum', side_effect=OSError()):
+        assert pc.size(exclude=True, lazy=False) == 0
+        assert pc.size(exclude=False, lazy=False) == 0
+
+    pc = PersistentStore(namespace=namespace, path=str(tmpdir))
+    with mock.patch('glob.glob', side_effect=OSError()):
+        assert pc.files(exclude=True, lazy=False) == []
+        assert pc.files(exclude=False, lazy=False) == []
+
+    pc = PersistentStore(
+        namespace=namespace, path=str(tmpdir),
+        mode=PersistentStoreMode.FLUSH)
+
+    # Causes an initialization
+    pc['abc'] = 1
+    with mock.patch('os.unlink', side_effect=OSError()):
+        # Now we can't set data
+        with pytest.raises(KeyError):
+            pc['new-key'] = 'value'
+        # However, keys that already exist don't get caught by the check
+        # and therefore won't throw
+        pc['abc'] = 'value'
+
+    #
+    # Handles flush() when the queue is empty
+    #
+    pc.clear()
+    with mock.patch('os.unlink', side_effect=OSError()):
+        # We can't remove backup cache file
+        assert pc.flush(force=True) is False
+
+    with mock.patch('os.unlink', side_effect=FileNotFoundError()):
+        # FileNotFound is not an issue
+        assert pc.flush(force=True) is True
+
+    with mock.patch('os.rename', side_effect=OSError()):
+        # We can't create a backup
+        assert pc.flush(force=True) is False
+
+    with mock.patch('os.rename', side_effect=FileNotFoundError()):
+        # FileNotFound is not an issue
+        assert pc.flush(force=True) is True
+
+    # Flush any previous cache and data
+    pc.delete()
+
+    #
+    # Handles flush() cases where there is data to write
+    #
+
+    # Create a key
+    pc.set('abc', 'a-test-value')
+    with mock.patch(
+            'os.unlink', side_effect=(OSError(), None)):
+        # We failed to move our content in place
+        assert pc.flush(force=True) is False
+
+    with mock.patch(
+            'os.unlink', side_effect=(OSError(), FileNotFoundError())):
+        # We failed to move our content in place
+        assert pc.flush(force=True) is False
+
+    with mock.patch(
+            'os.unlink', side_effect=(OSError(), OSError())):
+        # We failed to move our content in place
+        assert pc.flush(force=True) is False
+
+
+def test_persistent_custom_io(tmpdir):
+    """
+    Test reading and writing custom files
+    """
+
+    # Initialize it (a path is supplied, so the default AUTO mode applies)
+    pc = PersistentStore(path=str(tmpdir))
+
+    with pytest.raises(AttributeError):
+        pc.open('!invalid#-Key')
+
+    # We can't open the file as it does not exist
+    with pytest.raises(FileNotFoundError):
+        pc.open('valid-key')
+
+    with pytest.raises(AttributeError):
+        # Bad data
+        pc.open(1234)
+
+    with pytest.raises(FileNotFoundError):
+        with pc.open('key') as fd:
+            pass
+
+    # Also can be caught using Apprise Exception Handling
+    with pytest.raises(exception.AppriseFileNotFound):
+        with pc.open('key') as fd:
+            pass
+
+    # Write some valid data
+    with pc.open('new-key', 'wb') as fd:
+        fd.write(b'data')
+
+    with mock.patch("builtins.open", new_callable=mock.mock_open,
+                    read_data="mocked file content") as mock_file:
+        mock_file.side_effect = OSError
+        with pytest.raises(exception.AppriseDiskIOError):
+            with pc.open('new-key', compress=False) as fd:
+                pass
+
+    # Again but with compression this time
+    with mock.patch("gzip.open", new_callable=mock.mock_open,
+                    read_data="mocked file content") as mock_file:
+        mock_file.side_effect = OSError
+        with pytest.raises(exception.AppriseDiskIOError):
+            with pc.open('new-key', compress=True) as fd:
+                pass
+
+    # Zlib error handling as well during open
+    with mock.patch("gzip.open", new_callable=mock.mock_open,
+                    read_data="mocked file content") as mock_file:
+        mock_file.side_effect = zlib.error
+        with pytest.raises(exception.AppriseDiskIOError):
+            with pc.open('new-key', compress=True) as fd:
+                pass
+
+    # Writing
+    with pytest.raises(AttributeError):
+        pc.write(1234)
+
+    with pytest.raises(AttributeError):
+        pc.write(None)
+
+    with pytest.raises(AttributeError):
+        pc.write(True)
+
+    pc = PersistentStore(str(tmpdir))
+    with pc.open('key', 'wb') as fd:
+        fd.write(b'test')
+        fd.close()
+
+    # Handle error capturing when failing to write to disk
+    with mock.patch("gzip.open", new_callable=mock.mock_open,
+                    read_data="mocked file content") as mock_file:
+        mock_file.side_effect = zlib.error
+
+        # We fail to write to disk
+        assert pc.write(b'test') is False
+
+        # We support other errors too
+        mock_file.side_effect = OSError
+        assert pc.write(b'test') is False
+
+    with pytest.raises(AttributeError):
+        pc.write(b'data', key='!invalid#-Key')
+
+    pc.delete()
+    with mock.patch('os.unlink', side_effect=OSError()):
+        # Write our data and the __move() will fail under the hood
+        assert pc.write(b'test') is False
+
+    pc.delete()
+    with mock.patch('os.rename', side_effect=OSError()):
+        # Write our data and the __move() will fail under the hood
+        assert pc.write(b'test') is False
+
+    pc.delete()
+    with mock.patch('os.unlink', side_effect=(OSError(), FileNotFoundError())):
+        # Write our data and the __move() will fail under the hood
+        assert pc.write(b'test') is False
+
+    pc.delete()
+    with mock.patch('os.unlink', side_effect=(OSError(), None)):
+        # Write our data and the __move() will fail under the hood
+        assert pc.write(b'test') is False
+
+    pc.delete()
+    with mock.patch('os.unlink', side_effect=(OSError(), OSError())):
+        # Write our data and the __move() will fail under the hood
+        assert pc.write(b'test') is False
+
+    pc.delete()
+    with mock.patch('os.rename', side_effect=(None, OSError(), None)):
+        assert pc.write(b'test') is False
+
+    with mock.patch('os.rename', side_effect=(None, OSError(), OSError())):
+        assert pc.write(b'test') is False
+
+    with mock.patch('os.rename', side_effect=(
+            None, OSError(), FileNotFoundError())):
+        assert pc.write(b'test') is False
+
+    pc.delete()
+    with mock.patch('os.rename', side_effect=(None, None, None, OSError())):
+        # not enough reason to fail
+        assert pc.write(b'test') is True
+
+    with mock.patch('os.stat', side_effect=OSError()):
+        with mock.patch('os.close', side_effect=(None, OSError())):
+            assert pc.write(b'test') is False
+
+    pc.delete()
+    with mock.patch(
+            'tempfile._TemporaryFileWrapper.close', side_effect=OSError()):
+        assert pc.write(b'test') is False
+
+    pc.delete()
+    with mock.patch(
+            'tempfile._TemporaryFileWrapper.close',
+            side_effect=(OSError(), None)):
+        with mock.patch('os.unlink', side_effect=OSError()):
+            assert pc.write(b'test') is False
+
+    pc.delete()
+    with mock.patch(
+            'tempfile._TemporaryFileWrapper.close',
+            side_effect=(OSError(), None)):
+        with mock.patch('os.unlink', side_effect=FileNotFoundError()):
+            assert pc.write(b'test') is False
+
+
+def test_persistent_storage_cache_object(tmpdir):
+    """
+    General testing of a CacheObject
+    """
+    # A cache object
+    c = CacheObject(123)
+
+    ref = datetime.now(tz=timezone.utc)
+    expires = ref + timedelta(days=1)
+    # Create a cache object that expires tomorrow
+    c = CacheObject('abcd', expires=expires)
+    assert c.expires == expires
+    assert c.expires_sec > 86390.0 and c.expires_sec <= 86400.0
+    assert bool(c) is True
+    assert 'never' not in str(c)
+    assert 'str:+:abcd' in str(c)
+
+    #
+    # Testing CacheObject.set()
+    #
+    c.set(123)
+    assert 'never' not in str(c)
+    assert 'int:+:123' in str(c)
+    hash_value = c.hash()
+    assert isinstance(hash_value, str)
+
+    c.set(124)
+    assert 'never' not in str(c)
+    assert 'int:+:124' in str(c)
+    assert c.hash() != hash_value
+
+    c.set(123)
+    # sha is the same again if we set the value back
+    assert c.hash() == hash_value
+
+    c.set(124)
+    assert isinstance(c.hash(), str)
+    assert c.value == 124
+    assert bool(c) is True
+    c.set(124, expires=False, persistent=False)
+    assert bool(c) is True
+    assert c.expires is None
+    assert c.expires_sec is None
+    c.set(124, expires=True)
+    # we're expired now
+    assert bool(c) is False
+
+    #
+    # Testing CacheObject equality (==)
+    #
+    a = CacheObject('abc')
+    b = CacheObject('abc')
+
+    assert a == b
+    assert a == 'abc'
+    assert b == 'abc'
+
+    # An expiry time was added, so the objects are no longer equal
+    b = CacheObject('abc', 30)
+    assert a != b
+    # however we can look at the value inside
+    assert a == b.value
+
+    b = CacheObject('abc', persistent=False)
+    a = CacheObject('abc', persistent=True)
+    # Persistent flag matters
+    assert a != b
+    # however we can look at the value inside
+    assert a == b.value
+    b = CacheObject('abc', persistent=True)
+    assert a == b
+
+    # Epoch
+    EPOCH = datetime(1970, 1, 1)
+
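+    # Serialized entries (below) carry the value ('v'), the expiry as
+    # seconds since this epoch ('x'), the value's class name ('c') and,
+    # optionally, a sha checksum ('!')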
+    # test all of our supported types (also test time naive and aware times)
+    for entry in ('string', 123, 1.2222, datetime.now(),
+                  datetime.now(tz=timezone.utc), None, False, True, b'\0'):
+        # Create a cache object that expires tomorrow
+        c = CacheObject(entry, datetime.now() + timedelta(days=1))
+
+        # Verify our content hasn't expired
+        assert c
+
+        # Verify we can dump our object
+        result = json.loads(json.dumps(
+            c, separators=(',', ':'), cls=CacheJSONEncoder))
+
+        # Instantiate our object
+        cc = CacheObject.instantiate(result)
+        assert cc.json() == c.json()
+
+    # Test our JSON Encoder against items we don't support
+    with pytest.raises(TypeError):
+        json.loads(json.dumps(
+            object(), separators=(',', ':'), cls=CacheJSONEncoder))
+
+    assert CacheObject.instantiate(None) is None
+    assert CacheObject.instantiate({}) is None
+
+    # Bad data
+    assert CacheObject.instantiate({
+        'v': 123,
+        'x': datetime.now(),
+        'c': 'int'}) is None
+
+    # object type is not supported
+    assert CacheObject.instantiate({
+        'v': 123,
+        'x': (datetime.now() - EPOCH).total_seconds(),
+        'c': object}) is None
+
+    obj = CacheObject.instantiate({
+        'v': 123,
+        'x': (datetime.now() - EPOCH).total_seconds(),
+        'c': 'int'}, verify=False)
+    assert isinstance(obj, CacheObject)
+    assert obj.value == 123
+
+    # no HASH and verify is set to true; our checksum will fail
+    assert CacheObject.instantiate({
+        'v': 123,
+        'x': (datetime.now() - EPOCH).total_seconds(),
+        'c': 'int'}, verify=True) is None
+
+    # We can't instantiate our object if the expiry value is bad
+    assert CacheObject.instantiate({
+        'v': 123,
+        'x': 'garbage',
+        'c': 'int'}, verify=False) is None
+
+    # We need a valid hash sum too
+    assert CacheObject.instantiate({
+        'v': 123,
+        'x': (datetime.now() - EPOCH).total_seconds(),
+        'c': 'int',
+        # Expecting a valid sha string
+        '!': 1.0}, verify=False) is None
+
+    # Our Bytes Object with corruption
+    assert CacheObject.instantiate({
+        'v': 'garbage',
+        'x': (datetime.now() - EPOCH).total_seconds(),
+        'c': 'bytes'}, verify=False) is None
+
+    obj = CacheObject.instantiate({
+        'v': 'AA==',
+        'x': (datetime.now() - EPOCH).total_seconds(),
+        'c': 'bytes'}, verify=False)
+    assert isinstance(obj, CacheObject)
+    assert obj.value == b'\0'
+
+    # Test our datetime objects
+    obj = CacheObject.instantiate({
+        'v': '2024-06-08T01:50:01.587267',
+        'x': (datetime.now() - EPOCH).total_seconds(),
+        'c': 'datetime'}, verify=False)
+    assert isinstance(obj, CacheObject)
+    assert obj.value == datetime(2024, 6, 8, 1, 50, 1, 587267)
+
+    # A corrupt datetime object
+    assert CacheObject.instantiate({
+        'v': 'garbage',
+        'x': (datetime.now() - EPOCH).total_seconds(),
+        'c': 'datetime'}, verify=False) is None
+
+
+def test_persistent_storage_disk_prune(tmpdir):
+    """
+    General testing of the Persistent Store prune calls
+    """
+
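+    # disk_prune() returns a dict keyed by namespace; each value lists
+    # the entries eligible for removal, and nothing is actually deleted
+    # unless action=True is passed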
+    # Persistent Storage Initialization
+    pc = PersistentStore(
+        path=str(tmpdir), namespace='t01', mode=PersistentStoreMode.FLUSH)
+    # Store some data
+    assert pc.write(b'data-t01') is True
+    assert pc.set('key-t01', 'value')
+
+    pc = PersistentStore(
+        path=str(tmpdir), namespace='t02', mode=PersistentStoreMode.FLUSH)
+    # Store some data
+    assert pc.write(b'data-t02') is True
+    assert pc.set('key-t02', 'value')
+
+    # prune anything older than 30s
+    results = PersistentStore.disk_prune(path=str(tmpdir), expires=30)
+    # Nothing is older than 30s right now
+    assert isinstance(results, dict)
+    assert 't01' in results
+    assert 't02' in results
+    assert len(results['t01']) == 0
+    assert len(results['t02']) == 0
+
+    pc = PersistentStore(
+        path=str(tmpdir), namespace='t01', mode=PersistentStoreMode.FLUSH)
+
+    # Nothing is pruned
+    assert pc.get('key-t01') == 'value'
+    assert pc.read() == b'data-t01'
+
+    # An expiry of zero gets everything
+    results = PersistentStore.disk_prune(path=str(tmpdir), expires=0)
+    # We match everything now
+    assert isinstance(results, dict)
+    assert 't01' in results
+    assert 't02' in results
+    assert len(results['t01']) == 2
+    assert len(results['t02']) == 2
+
+    # Content is still not removed, however, because no removal action
+    # was requested
+    pc = PersistentStore(
+        path=str(tmpdir), namespace='t02', mode=PersistentStoreMode.FLUSH)
+    # Nothing is pruned
+    assert pc.get('key-t02') == 'value'
+    assert pc.read() == b'data-t02'
+    pc = PersistentStore(
+        path=str(tmpdir), namespace='t01', mode=PersistentStoreMode.FLUSH)
+    # Nothing is pruned
+    assert pc.get('key-t01') == 'value'
+    assert pc.read() == b'data-t01'
+
+    with mock.patch('os.listdir', side_effect=OSError()):
+        results = PersistentStore.disk_scan(
+            namespace='t01', path=str(tmpdir), closest=True)
+        assert isinstance(results, list)
+        assert len(results) == 0
+
+    with mock.patch('os.listdir', side_effect=FileNotFoundError()):
+        results = PersistentStore.disk_scan(
+            namespace='t01', path=str(tmpdir), closest=True)
+        assert isinstance(results, list)
+        assert len(results) == 0
+
+        # Without closest flag
+        results = PersistentStore.disk_scan(
+            namespace='t01', path=str(tmpdir), closest=False)
+        assert isinstance(results, list)
+        assert len(results) == 0
+
+    # Now we'll filter on specific namespaces
+    results = PersistentStore.disk_prune(
+        namespace='notfound', path=str(tmpdir), expires=0, action=True)
+
+    # nothing matched, nothing found
+    assert isinstance(results, dict)
+    assert len(results) == 0
+
+    results = PersistentStore.disk_prune(
+        namespace=('t01', 'invalid', '-garbag!'),
+        path=str(tmpdir), expires=0, action=True)
+
+    # only t01 would be cleaned now
+    assert isinstance(results, dict)
+    assert len(results) == 1
+    assert len(results['t01']) == 2
+
+    # A second call will yield no results because the content has
+    # already been cleaned up
+    results = PersistentStore.disk_prune(
+        namespace='t01',
+        path=str(tmpdir), expires=0, action=True)
+    assert isinstance(results, dict)
+    assert len(results) == 0
+
+    # t02 is still untouched
+    pc = PersistentStore(
+        path=str(tmpdir), namespace='t02', mode=PersistentStoreMode.FLUSH)
+    # Nothing is pruned
+    assert pc.get('key-t02') == 'value'
+    assert pc.read() == b'data-t02'
+
+    # t01 of course... it's gone
+    pc = PersistentStore(
+        path=str(tmpdir), namespace='t01', mode=PersistentStoreMode.FLUSH)
+    # Everything was pruned
+    assert pc.get('key-t01') is None
+    assert pc.read() is None
+
+    with pytest.raises(AttributeError):
+        # provide garbage in namespace field and we're going to have a problem
+        PersistentStore.disk_prune(
+            namespace=object, path=str(tmpdir), expires=0, action=True)
+
+    # Error Handling
+    with mock.patch('os.path.getmtime', side_effect=FileNotFoundError()):
+        results = PersistentStore.disk_prune(
+            namespace='t02', path=str(tmpdir), expires=0, action=True)
+        assert isinstance(results, dict)
+        assert len(results) == 1
+        assert len(results['t02']) == 0
+
+        # no files were removed, so our data is still accessible
+        pc = PersistentStore(
+            path=str(tmpdir), namespace='t02', mode=PersistentStoreMode.FLUSH)
+        # Nothing is pruned
+        assert pc.get('key-t02') == 'value'
+        assert pc.read() == b'data-t02'
+
+    with mock.patch('os.path.getmtime', side_effect=OSError()):
+        results = PersistentStore.disk_prune(
+            namespace='t02', path=str(tmpdir), expires=0, action=True)
+        assert isinstance(results, dict)
+        assert len(results) == 1
+        assert len(results['t02']) == 0
+
+        # no files were removed, so our data is still accessible
+        pc = PersistentStore(
+            path=str(tmpdir), namespace='t02', mode=PersistentStoreMode.FLUSH)
+        # Nothing is pruned
+        assert pc.get('key-t02') == 'value'
+        assert pc.read() == b'data-t02'
+
+    with mock.patch('os.unlink', side_effect=FileNotFoundError()):
+        results = PersistentStore.disk_prune(
+            namespace='t02', path=str(tmpdir), expires=0, action=True)
+        assert isinstance(results, dict)
+        assert len(results) == 1
+        assert len(results['t02']) == 2
+
+        # no files were removed, so our data is still accessible
+        pc = PersistentStore(
+            path=str(tmpdir), namespace='t02', mode=PersistentStoreMode.FLUSH)
+        # Nothing is pruned
+        assert pc.get('key-t02') == 'value'
+        assert pc.read() == b'data-t02'
+
+    with mock.patch('os.unlink', side_effect=OSError()):
+        results = PersistentStore.disk_prune(
+            namespace='t02', path=str(tmpdir), expires=0, action=True)
+        assert isinstance(results, dict)
+        assert len(results) == 1
+        assert len(results['t02']) == 2
+
+        # no files were removed, so our data is still accessible
+        pc = PersistentStore(
+            path=str(tmpdir), namespace='t02', mode=PersistentStoreMode.FLUSH)
+        # Nothing is pruned
+        assert pc.get('key-t02') == 'value'
+        assert pc.read() == b'data-t02'
+
+    with mock.patch('os.rmdir', side_effect=OSError()):
+        results = PersistentStore.disk_prune(
+            namespace='t02', path=str(tmpdir), expires=0, action=True)
+        assert isinstance(results, dict)
+        assert len(results) == 1
+        assert len(results['t02']) == 2
+
+        # the files themselves were removed; only the directory removal failed
+        pc = PersistentStore(
+            path=str(tmpdir), namespace='t02', mode=PersistentStoreMode.FLUSH)
+        # Everything was pruned
+        assert pc.get('key-t02') is None
+        assert pc.read() is None
+
+
+def test_persistent_storage_disk_changes(tmpdir):
+    """
+    General testing of a Persistent Store with underlying disk changes
+    """
+
+    # Create a garbage file in place of where the namespace should be
+    tmpdir.join('t01').write('0' * 1024)
+
+    # Persistent Storage Initialization where the namespace directory is
+    # already occupied by a regular file
+    pc = PersistentStore(
+        path=str(tmpdir), namespace='t01', mode=PersistentStoreMode.FLUSH)
+
+    # Attempt to store some data and note that it isn't possible
+    assert pc.write(b'data-t01') is False
+    # We actually fell back to memory mode:
+    assert pc.mode == PersistentStoreMode.MEMORY
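+    # The namespace path is occupied by a regular file, so the directory
+    # could not be created; rather than failing, the store degrades to
+    # in-memory operation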
+
+    # set() still works (in memory)
+    assert pc.set('key-t01', 'value')
+
+    # But upon reinitialization (enforcing memory mode check) we will not have
+    # the data available to us
+    pc = PersistentStore(
+        path=str(tmpdir), namespace='t01', mode=PersistentStoreMode.FLUSH)
+
+    assert pc.get('key-t01') is None
+
+    #
+    # Test situation where the file structure changed after initialization
+    #
+    pc = PersistentStore(
+        path=str(tmpdir), namespace='t02', mode=PersistentStoreMode.FLUSH)
+    # Our mode held, since t02 initialized correctly
+    assert pc.mode == PersistentStoreMode.FLUSH
+    assert os.path.isdir(pc.path)
+
+    shutil.rmtree(pc.path)
+    assert not os.path.isdir(pc.path)
+    assert pc.set('key-t02', 'value')
+    # The directory got re-created
+    assert os.path.isdir(pc.path)
+
+    # Same test but flag set to AUTO
+    pc = PersistentStore(
+        path=str(tmpdir), namespace='t02', mode=PersistentStoreMode.AUTO)
+    # Our mode held, since t02 initialized correctly
+    assert pc.mode == PersistentStoreMode.AUTO
+    assert os.path.isdir(pc.path)
+
+    shutil.rmtree(pc.path)
+    assert not os.path.isdir(pc.path)
+    assert pc.set('key-t02', 'value')
+    # The directory is not recreated in AUTO mode; that only occurs on save
+    assert not os.path.isdir(pc.path)
+    path = pc.path
+    del pc
+    # It exists now
+    assert os.path.isdir(path)
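+    # In AUTO mode the accumulated writes are only flushed when the object
+    # is destroyed, hence the directory appears only after the del above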
+
+    pc = PersistentStore(
+        path=str(tmpdir), namespace='t02', mode=PersistentStoreMode.FLUSH)
+    # Content was not lost
+    assert pc.get('key-t02') == 'value'
+
+    # We'll remove a subdirectory of it this time
+    shutil.rmtree(os.path.join(pc.path, pc.temp_dir))
+
+    # We will still successfully write our data
+    assert pc.write(b'data-t02') is True
+    assert os.path.isdir(pc.path)
+
+    shutil.rmtree(pc.path)
+    assert not os.path.isdir(pc.path)
+    assert pc.set('key-t01', 'value')
diff --git a/test/test_plugin_aprs.py b/test/test_plugin_aprs.py
index 2399c956..5c1c7ff8 100644
--- a/test/test_plugin_aprs.py
+++ b/test/test_plugin_aprs.py
@@ -190,6 +190,9 @@ def test_plugin_aprs_edge_cases(mock_create_connection):
         "aprs://DF1JSL-15:12345@DF1ABC/DF1DEF")
     assert isinstance(instance, NotifyAprs)
 
+    # our URL Identifier
+    assert isinstance(instance.url_id(), str)
+
     # Objects read
     assert len(instance) == 2
 
diff --git a/test/test_plugin_glib.py b/test/test_plugin_dbus.py
similarity index 99%
rename from test/test_plugin_glib.py
rename to test/test_plugin_dbus.py
index ecada38e..5a491c09 100644
--- a/test/test_plugin_glib.py
+++ b/test/test_plugin_dbus.py
@@ -194,6 +194,9 @@ def test_plugin_dbus_general_success(mocker, dbus_glib_environment):
     assert obj.url().startswith('dbus://_/')
     assert re.search('image=yes', obj.url())
 
+    # URL ID Generation is disabled
+    assert obj.url_id() is None
+
     assert obj.notify(
         title='title', body='body',
         notify_type=apprise.NotifyType.INFO) is True
diff --git a/test/test_plugin_email.py b/test/test_plugin_email.py
index 1718c799..59b4d384 100644
--- a/test/test_plugin_email.py
+++ b/test/test_plugin_email.py
@@ -348,6 +348,9 @@ def test_plugin_email(mock_smtp, mock_smtpssl):
                 # We loaded okay; now lets make sure we can reverse this url
                 assert isinstance(obj.url(), str)
 
+                # Get our URL Identifier
+                assert isinstance(obj.url_id(), str)
+
                 # Verify we can acquire a target count as an integer
                 assert isinstance(len(obj), int)
 
diff --git a/test/test_plugin_gnome.py b/test/test_plugin_gnome.py
index eff460ea..cab1725a 100644
--- a/test/test_plugin_gnome.py
+++ b/test/test_plugin_gnome.py
@@ -136,6 +136,9 @@ def test_plugin_gnome_general_success(obj):
     # Test url() call
     assert isinstance(obj.url(), str) is True
 
+    # our URL Identifier is disabled
+    assert obj.url_id() is None
+
     # test notifications
     assert obj.notify(title='title', body='body',
                       notify_type=apprise.NotifyType.INFO) is True
diff --git a/test/test_plugin_growl.py b/test/test_plugin_growl.py
index 25886d0c..604fa2c3 100644
--- a/test/test_plugin_growl.py
+++ b/test/test_plugin_growl.py
@@ -273,6 +273,9 @@ def test_plugin_growl_general(mock_gntp):
 
             assert isinstance(obj, instance) is True
 
+            # Test that our URL Identifier is generated
+            assert isinstance(obj.url_id(), str) is True
+
             if isinstance(obj, NotifyBase):
                 # We loaded okay; now lets make sure we can reverse this url
                 assert isinstance(obj.url(), str) is True
diff --git a/test/test_plugin_macosx.py b/test/test_plugin_macosx.py
index d2cd44b5..e3c002b3 100644
--- a/test/test_plugin_macosx.py
+++ b/test/test_plugin_macosx.py
@@ -100,6 +100,11 @@ def test_plugin_macosx_general_success(macos_notify_environment):
     # Test url() call
     assert isinstance(obj.url(), str) is True
 
+    # URL Identifier has been disabled as this isn't unique enough
+    # to be mapped to more than 1 endpoint; verify that None is always
+    # returned
+    assert obj.url_id() is None
+
     # test notifications
     assert obj.notify(title='title', body='body',
                       notify_type=apprise.NotifyType.INFO) is True
diff --git a/test/test_plugin_matrix.py b/test/test_plugin_matrix.py
index c9eb6ced..1473887a 100644
--- a/test/test_plugin_matrix.py
+++ b/test/test_plugin_matrix.py
@@ -30,7 +30,8 @@ from unittest import mock
 import os
 import requests
 import pytest
-from apprise import Apprise, AppriseAsset, AppriseAttachment, NotifyType
+from apprise import (
+    Apprise, AppriseAsset, AppriseAttachment, NotifyType, PersistentStoreMode)
 from json import dumps
 
 from apprise.plugins.matrix import NotifyMatrix
@@ -646,49 +647,56 @@ def test_plugin_matrix_rooms(mock_post, mock_get, mock_put):
 
     assert obj._room_join('!abc123') == response_obj['room_id']
     # Use cache to get same results
-    assert len(obj._room_cache) == 1
+    assert obj.store.get('!abc123') is None
+    # However this is how the cache entry gets stored
+    assert obj.store.get('!abc123:localhost') is not None
+    assert obj.store.get('!abc123:localhost')['id'] == response_obj['room_id']
     assert obj._room_join('!abc123') == response_obj['room_id']
 
-    obj._room_cache = {}
+    obj.store.clear()
     assert obj._room_join('!abc123:localhost') == response_obj['room_id']
+    assert obj.store.get('!abc123:localhost') is not None
+    assert obj.store.get('!abc123:localhost')['id'] == response_obj['room_id']
     # Use cache to get same results
-    assert len(obj._room_cache) == 1
     assert obj._room_join('!abc123:localhost') == response_obj['room_id']
 
-    obj._room_cache = {}
+    obj.store.clear()
     assert obj._room_join('abc123') == response_obj['room_id']
     # Use cache to get same results
-    assert len(obj._room_cache) == 1
+    assert obj.store.get('#abc123:localhost') is not None
+    assert obj.store.get('#abc123:localhost')['id'] == response_obj['room_id']
     assert obj._room_join('abc123') == response_obj['room_id']
 
-    obj._room_cache = {}
+    obj.store.clear()
     assert obj._room_join('abc123:localhost') == response_obj['room_id']
     # Use cache to get same results
-    assert len(obj._room_cache) == 1
+    assert obj.store.get('#abc123:localhost') is not None
+    assert obj.store.get('#abc123:localhost')['id'] == response_obj['room_id']
     assert obj._room_join('abc123:localhost') == response_obj['room_id']
 
-    obj._room_cache = {}
+    obj.store.clear()
     assert obj._room_join('#abc123:localhost') == response_obj['room_id']
     # Use cache to get same results
-    assert len(obj._room_cache) == 1
+    assert obj.store.get('#abc123:localhost') is not None
+    assert obj.store.get('#abc123:localhost')['id'] == response_obj['room_id']
     assert obj._room_join('#abc123:localhost') == response_obj['room_id']
 
-    obj._room_cache = {}
+    obj.store.clear()
     assert obj._room_join('%') is None
     assert obj._room_join(None) is None
 
     # 403 response; this will push for a room creation for alias based rooms
     # and these will fail
     request.status_code = 403
-    obj._room_cache = {}
+    obj.store.clear()
     assert obj._room_join('!abc123') is None
-    obj._room_cache = {}
+    obj.store.clear()
     assert obj._room_join('!abc123:localhost') is None
-    obj._room_cache = {}
+    obj.store.clear()
     assert obj._room_join('abc123') is None
-    obj._room_cache = {}
+    obj.store.clear()
     assert obj._room_join('abc123:localhost') is None
-    obj._room_cache = {}
+    obj.store.clear()
     assert obj._room_join('#abc123:localhost') is None
     del obj
 
@@ -707,24 +715,24 @@ def test_plugin_matrix_rooms(mock_post, mock_get, mock_put):
     # You can't add room_id's, they must be aliases
     assert obj._room_create('!abc123') is None
     assert obj._room_create('!abc123:localhost') is None
-    obj._room_cache = {}
+    obj.store.clear()
     assert obj._room_create('abc123') == response_obj['room_id']
-    obj._room_cache = {}
+    obj.store.clear()
     assert obj._room_create('abc123:localhost') == response_obj['room_id']
-    obj._room_cache = {}
+    obj.store.clear()
     assert obj._room_create('#abc123:localhost') == response_obj['room_id']
-    obj._room_cache = {}
+    obj.store.clear()
     assert obj._room_create('%') is None
     assert obj._room_create(None) is None
 
     # 403 response; this will push for a room creation for alias based rooms
     # and these will fail
     request.status_code = 403
-    obj._room_cache = {}
+    obj.store.clear()
     assert obj._room_create('abc123') is None
-    obj._room_cache = {}
+    obj.store.clear()
     assert obj._room_create('abc123:localhost') is None
-    obj._room_cache = {}
+    obj.store.clear()
     assert obj._room_create('#abc123:localhost') is None
 
     request.status_code = 403
@@ -732,7 +740,7 @@ def test_plugin_matrix_rooms(mock_post, mock_get, mock_put):
         u'errcode': u'M_ROOM_IN_USE',
         u'error': u'Room alias already taken',
     })
-    obj._room_cache = {}
+    obj.store.clear()
     # This causes us to look up a channel ID if we get a ROOM_IN_USE response
     assert obj._room_create('#abc123:localhost') is None
     del obj
@@ -780,19 +788,19 @@ def test_plugin_matrix_rooms(mock_post, mock_get, mock_put):
     # You can't add room_id's, they must be aliases
     assert obj._room_id('!abc123') is None
     assert obj._room_id('!abc123:localhost') is None
-    obj._room_cache = {}
+    obj.store.clear()
     assert obj._room_id('abc123') == response_obj['room_id']
-    obj._room_cache = {}
+    obj.store.clear()
     assert obj._room_id('abc123:localhost') == response_obj['room_id']
-    obj._room_cache = {}
+    obj.store.clear()
     assert obj._room_id('#abc123:localhost') == response_obj['room_id']
-    obj._room_cache = {}
+    obj.store.clear()
     assert obj._room_id('%') is None
     assert obj._room_id(None) is None
 
     # If we can't look the code up, we return None
     request.status_code = 403
-    obj._room_cache = {}
+    obj.store.clear()
     assert obj._room_id('#abc123:localhost') is None
 
     # Force a object removal (thus a logout call)
@@ -1164,7 +1172,8 @@ def test_plugin_matrix_attachments_api_v2(mock_post, mock_get):
 @mock.patch('requests.put')
 @mock.patch('requests.get')
 @mock.patch('requests.post')
-def test_plugin_matrix_transaction_ids_api_v3(mock_post, mock_get, mock_put):
+def test_plugin_matrix_transaction_ids_api_v3_no_cache(
+        mock_post, mock_get, mock_put):
     """
     NotifyMatrix() Transaction ID Checks (v3)
 
@@ -1184,12 +1193,17 @@ def test_plugin_matrix_transaction_ids_api_v3(mock_post, mock_get, mock_put):
     mock_get.return_value = response
     mock_put.return_value = response
 
+    # Each element is one batch to run;
+    # the number defined is the number of notifications to send
     batch = [10, 1, 5]
 
     for notifications in batch:
         # Instantiate our object
         obj = Apprise.instantiate('matrix://user:pass@localhost/#general?v=3')
 
+        # Ensure mode is memory
+        assert obj.store.mode == PersistentStoreMode.MEMORY
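+        # No AppriseAsset storage path was configured, so the store
+        # presumably defaults to in-memory operation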
+
         # Performs a login
         assert obj.notify(
             body='body', title='title', notify_type=NotifyType.INFO
@@ -1235,3 +1249,105 @@ def test_plugin_matrix_transaction_ids_api_v3(mock_post, mock_get, mock_put):
             'http://localhost/_matrix/client/v3/logout'
         mock_post.reset_mock()
         assert mock_put.call_count == 0
+
+
+@mock.patch('requests.put')
+@mock.patch('requests.get')
+@mock.patch('requests.post')
+def test_plugin_matrix_transaction_ids_api_v3_w_cache(
+        mock_post, mock_get, mock_put, tmpdir):
+    """
+    NotifyMatrix() Transaction ID Checks (v3)
+
+    """
+
+    # Prepare a good response
+    response = mock.Mock()
+    response.status_code = requests.codes.ok
+    response.content = MATRIX_GOOD_RESPONSE.encode('utf-8')
+
+    # Prepare a bad response
+    bad_response = mock.Mock()
+    bad_response.status_code = requests.codes.internal_server_error
+
+    # Prepare Mock return object
+    mock_post.return_value = response
+    mock_get.return_value = response
+    mock_put.return_value = response
+
+    # Each element is one batch to run;
+    # the number defined is the number of notifications to send
+    batch = [10, 1, 5]
+
+    mock_post.reset_mock()
+    mock_get.reset_mock()
+    mock_put.reset_mock()
+
+    asset = AppriseAsset(
+        storage_mode=PersistentStoreMode.FLUSH,
+        storage_path=str(tmpdir),
+    )
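+    # (Illustrative) a developer would enable disk-backed storage the same
+    # way outside of tests; the path below is hypothetical:
+    #   asset = AppriseAsset(storage_mode=PersistentStoreMode.FLUSH,
+    #                        storage_path='/var/cache/apprise')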
+
+    # Message Counter
+    transaction_id = 1
+
+    for no, notifications in enumerate(batch):
+        # Instantiate our object
+        obj = Apprise.instantiate(
+            'matrix://user:pass@localhost/#general?v=3', asset=asset)
+
+        # Ensure mode is flush
+        assert obj.store.mode == PersistentStoreMode.FLUSH
+
+        # Performs a login
+        assert obj.notify(
+            body='body', title='title', notify_type=NotifyType.INFO
+        ) is True
+        assert mock_get.call_count == 0
+        if no == 0:
+            # first entry
+            assert mock_post.call_count == 2
+            assert mock_post.call_args_list[0][0][0] == \
+                'http://localhost/_matrix/client/v3/login'
+            assert mock_post.call_args_list[1][0][0] == \
+                'http://localhost/_matrix/client/v3/' \
+                'join/%23general%3Alocalhost'
+            assert mock_put.call_count == 1
+            assert mock_put.call_args_list[0][0][0] == \
+                'http://localhost/_matrix/client/v3/rooms/' + \
+                '%21abc123%3Alocalhost/send/m.room.message/0'
+
+        for no, _ in enumerate(range(notifications), start=transaction_id):
+            # Clean our slate
+            mock_post.reset_mock()
+            mock_get.reset_mock()
+            mock_put.reset_mock()
+
+            assert obj.notify(
+                body='body', title='title', notify_type=NotifyType.INFO
+            ) is True
+
+            # Increment transaction counter
+            transaction_id += 1
+
+            assert mock_get.call_count == 0
+            assert mock_post.call_count == 0
+            assert mock_put.call_count == 1
+            assert mock_put.call_args_list[0][0][0] == \
+                'http://localhost/_matrix/client/v3/rooms/' + \
+                f'%21abc123%3Alocalhost/send/m.room.message/{no}'
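+            # The transaction ID continues to climb across batches because
+            # it is persisted to disk between instantiations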
+
+        # Increment transaction counter
+        transaction_id += 1
+
+        mock_post.reset_mock()
+        mock_get.reset_mock()
+        mock_put.reset_mock()
+
+        # Force an object removal
+        # Biggest takeaway is that a logout no longer happens
+        del obj
+
+        assert mock_get.call_count == 0
+        assert mock_post.call_count == 0
+        assert mock_put.call_count == 0
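+        # With no logout, the login token presumably remains in persistent
+        # storage for the next instantiation to reuse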
diff --git a/test/test_plugin_mqtt.py b/test/test_plugin_mqtt.py
index 3f748606..3948b95b 100644
--- a/test/test_plugin_mqtt.py
+++ b/test/test_plugin_mqtt.py
@@ -96,6 +96,9 @@ def test_plugin_mqtt_default_success(mqtt_client_mock):
     assert len(obj) == 1
     assert obj.url().startswith('mqtt://localhost:1234/my/topic')
 
+    # Generate the URL Identifier
+    assert isinstance(obj.url_id(), str)
+
     # Verify default settings.
     assert re.search(r'qos=0', obj.url())
     assert re.search(r'version=v3.1.1', obj.url())
diff --git a/test/test_plugin_notifiarr.py b/test/test_plugin_notifiarr.py
index cec36e10..2e5355d9 100644
--- a/test/test_plugin_notifiarr.py
+++ b/test/test_plugin_notifiarr.py
@@ -85,11 +85,23 @@ apprise_url_tests = (
         # Our expected url(privacy=True) startswith() response:
         'privacy_url': 'notifiarr://a...y/#123/#432',
     }),
+    ('notifiarr://apikey/?to=123,432&event=1234', {
+        # Test event
+        'instance': NotifyNotifiarr,
+        # Our expected url(privacy=True) startswith() response:
+        'privacy_url': 'notifiarr://a...y/#123/#432',
+    }),
     ('notifiarr://123/?apikey=myapikey', {
         'instance': NotifyNotifiarr,
         # Our expected url(privacy=True) startswith() response:
         'privacy_url': 'notifiarr://m...y/#123',
     }),
+    ('notifiarr://123/?key=myapikey', {
+        # Support key=
+        'instance': NotifyNotifiarr,
+        # Our expected url(privacy=True) startswith() response:
+        'privacy_url': 'notifiarr://m...y/#123',
+    }),
     ('notifiarr://123/?apikey=myapikey&image=yes', {
         'instance': NotifyNotifiarr,
     }),
diff --git a/test/test_plugin_ntfy.py b/test/test_plugin_ntfy.py
index 30d0eacb..e05da251 100644
--- a/test/test_plugin_ntfy.py
+++ b/test/test_plugin_ntfy.py
@@ -573,6 +573,15 @@ def test_plugin_ntfy_config_files(mock_post, mock_get):
     assert next(aobj.find(tag='ntfy_invalid')).priority == \
         NtfyPriority.NORMAL
 
+    # A cloud reference without any identifiers; the insecure mode (ntfy://)
+    # is not considered during the id generation, as ntfys:// is always
+    # implied
+    results = NotifyNtfy.parse_url('ntfy://')
+    obj = NotifyNtfy(**results)
+    new_results = NotifyNtfy.parse_url(obj.url())
+    obj2 = NotifyNtfy(**new_results)
+    assert obj.url_id() == obj2.url_id()
+
 
 @mock.patch('requests.post')
 def test_plugin_ntfy_message_to_attach(mock_post):
diff --git a/test/test_plugin_opsgenie.py b/test/test_plugin_opsgenie.py
index 7d8c6c58..fb692ac8 100644
--- a/test/test_plugin_opsgenie.py
+++ b/test/test_plugin_opsgenie.py
@@ -27,10 +27,11 @@
 # POSSIBILITY OF SUCH DAMAGE.
 
 from unittest import mock
-
+from json import dumps
 import requests
 import apprise
-from apprise.plugins.opsgenie import NotifyOpsgenie, OpsgeniePriority
+from apprise.plugins.opsgenie import (
+    NotifyType, NotifyOpsgenie, OpsgeniePriority)
 from helpers import AppriseURLTester
 
 # Disable logging for a cleaner testing output
@@ -40,6 +41,12 @@ logging.disable(logging.CRITICAL)
 # a test UUID we can use
 UUID4 = '8b799edf-6f98-4d3a-9be7-2862fb4e5752'
 
+OPSGENIE_GOOD_RESPONSE = dumps({
+    "result": "Request will be processed",
+    "took": 0.204,
+    "requestId": "43a29c5c-3dbf-4fa4-9c26-f4f71023e120"
+})
+
 # Our Testing URLs
 apprise_url_tests = (
     ('opsgenie://', {
@@ -58,49 +65,155 @@ apprise_url_tests = (
         # invalid region id
         'instance': TypeError,
     }),
+    ('opsgenie://user@apikey/', {
+        # No targets specified; this is allowed
+        'instance': NotifyOpsgenie,
+        'notify_type': NotifyType.WARNING,
+        # Bad response returned
+        'requests_response_text': '{',
+        # We will not be successful sending the notice
+        'notify_response': False,
+    }),
     ('opsgenie://apikey/', {
         # No targets specified; this is allowed
         'instance': NotifyOpsgenie,
+        'notify_type': NotifyType.FAILURE,
+        # Our response expected server response
+        'requests_response_text': OPSGENIE_GOOD_RESPONSE,
     }),
     ('opsgenie://apikey/user', {
         # Valid user
         'instance': NotifyOpsgenie,
+        'notify_type': NotifyType.FAILURE,
+        # Our response expected server response
+        'requests_response_text': OPSGENIE_GOOD_RESPONSE,
         'privacy_url': 'opsgenie://a...y/%40user',
     }),
     ('opsgenie://apikey/@user?region=eu', {
         # European Region
         'instance': NotifyOpsgenie,
+        'notify_type': NotifyType.FAILURE,
+        # Our response expected server response
+        'requests_response_text': OPSGENIE_GOOD_RESPONSE,
     }),
     ('opsgenie://apikey/@user?entity=A%20Entity', {
         # Assign an entity
         'instance': NotifyOpsgenie,
+        'notify_type': NotifyType.FAILURE,
+        # Our response expected server response
+        'requests_response_text': OPSGENIE_GOOD_RESPONSE,
     }),
     ('opsgenie://apikey/@user?alias=An%20Alias', {
         # Assign an alias
         'instance': NotifyOpsgenie,
+        'notify_type': NotifyType.FAILURE,
+        # Our response expected server response
+        'requests_response_text': OPSGENIE_GOOD_RESPONSE,
     }),
-    ('opsgenie://apikey/@user?priority=p3', {
+    # Bad Action
+    ('opsgenie://apikey/@user?action=invalid', {
+        # An invalid action causes a TypeError
+        'instance': TypeError,
+    }),
+    ('opsgenie://from@apikey/@user?:invalid=note', {
+        # An invalid notify-type mapping causes a TypeError
+        'instance': TypeError,
+    }),
+    ('opsgenie://apikey/@user?:warning=invalid', {
+        # An invalid action mapping causes a TypeError
+        'instance': TypeError,
+    }),
+    # Creates an index entry
+    ('opsgenie://apikey/@user?entity=index&action=new', {
+        # Assign an entity
+        'instance': NotifyOpsgenie,
+        'notify_type': NotifyType.FAILURE,
+        # Our response expected server response
+        'requests_response_text': OPSGENIE_GOOD_RESPONSE,
+    }),
+    # Now action it
+    ('opsgenie://apikey/@user?entity=index&action=acknowledge', {
+        # Assign an entity
+        'instance': NotifyOpsgenie,
+        'notify_type': NotifyType.SUCCESS,
+        # Our response expected server response
+        'requests_response_text': OPSGENIE_GOOD_RESPONSE,
+    }),
+    ('opsgenie://from@apikey/@user?entity=index&action=note', {
+        # Assign an entity
+        'instance': NotifyOpsgenie,
+        'notify_type': NotifyType.SUCCESS,
+        # Our response expected server response
+        'requests_response_text': OPSGENIE_GOOD_RESPONSE,
+    }),
+    ('opsgenie://from@apikey/@user?entity=index&action=note', {
+        # Assign an entity
+        'instance': NotifyOpsgenie,
+        'notify_type': NotifyType.SUCCESS,
+        # Our response expected server response
+        'requests_response_text': OPSGENIE_GOOD_RESPONSE,
+        'response': False,
+        'requests_response_code': 500,
+    }),
+    ('opsgenie://apikey/@user?entity=index&action=close', {
+        # Assign an entity
+        'instance': NotifyOpsgenie,
+        'notify_type': NotifyType.SUCCESS,
+        # Our response expected server response
+        'requests_response_text': OPSGENIE_GOOD_RESPONSE,
+    }),
+    ('opsgenie://apikey/@user?entity=index&action=delete', {
+        # Assign an entity
+        'instance': NotifyOpsgenie,
+        'notify_type': NotifyType.SUCCESS,
+        # Our response expected server response
+        'requests_response_text': OPSGENIE_GOOD_RESPONSE,
+    }),
+    # Map info notifications to the 'new' action
+    ('opsgenie://apikey/@user?entity=index2&:info=new', {
+        # Assign an entity
+        'instance': NotifyOpsgenie,
+        'notify_type': NotifyType.INFO,
+        # Our response expected server response
+        'requests_response_text': OPSGENIE_GOOD_RESPONSE,
+    }),
+    ('opsgenie://joe@apikey/@user?priority=p3', {
         # Assign our priority
         'instance': NotifyOpsgenie,
+        'notify_type': NotifyType.FAILURE,
+        # Our response expected server response
+        'requests_response_text': OPSGENIE_GOOD_RESPONSE,
     }),
     ('opsgenie://apikey/?tags=comma,separated', {
         # Test our our 'tags' (tag is reserved in Apprise) but not 'tags'
         # Also test the fact we do not need to define a target
         'instance': NotifyOpsgenie,
+        'notify_type': NotifyType.FAILURE,
+        # Our response expected server response
+        'requests_response_text': OPSGENIE_GOOD_RESPONSE,
     }),
     ('opsgenie://apikey/@user?priority=invalid', {
         # Invalid priority (loads using default)
         'instance': NotifyOpsgenie,
+        'notify_type': NotifyType.FAILURE,
+        # Our response expected server response
+        'requests_response_text': OPSGENIE_GOOD_RESPONSE,
     }),
     ('opsgenie://apikey/user@email.com/#team/*sche/^esc/%20/a', {
         # Valid user (email), valid schedule, Escalated ID,
         # an invalid entry (%20), and too short of an entry (a)
         'instance': NotifyOpsgenie,
+        'notify_type': NotifyType.FAILURE,
+        # Our response expected server response
+        'requests_response_text': OPSGENIE_GOOD_RESPONSE,
     }),
     ('opsgenie://apikey/@{}/#{}/*{}/^{}/'.format(
         UUID4, UUID4, UUID4, UUID4), {
         # similar to the above, except we use the UUID's
         'instance': NotifyOpsgenie,
+        'notify_type': NotifyType.FAILURE,
+        # Our response expected server response
+        'requests_response_text': OPSGENIE_GOOD_RESPONSE,
     }),
     # Same link as before but @ missing at the front causing an ambigious
     # lookup however the entry is treated a though a @ was in front (user)
@@ -108,31 +221,52 @@ apprise_url_tests = (
         UUID4, UUID4, UUID4, UUID4), {
         # similar to the above, except we use the UUID's
         'instance': NotifyOpsgenie,
+        'notify_type': NotifyType.FAILURE,
+        # Our response expected server response
+        'requests_response_text': OPSGENIE_GOOD_RESPONSE,
     }),
     ('opsgenie://apikey?to=#team,user&+key=value&+type=override', {
         # Test to= and details (key/value pair) also override 'type'
         'instance': NotifyOpsgenie,
+        'notify_type': NotifyType.FAILURE,
+        # Our response expected server response
+        'requests_response_text': OPSGENIE_GOOD_RESPONSE,
     }),
     ('opsgenie://apikey/#team/@user/?batch=yes', {
         # Test batch=
         'instance': NotifyOpsgenie,
+        'notify_type': NotifyType.FAILURE,
+        # Our response expected server response
+        'requests_response_text': OPSGENIE_GOOD_RESPONSE,
     }),
     ('opsgenie://apikey/#team/@user/?batch=no', {
         # Test batch=
         'instance': NotifyOpsgenie,
+        'notify_type': NotifyType.FAILURE,
+        # Our response expected server response
+        'requests_response_text': OPSGENIE_GOOD_RESPONSE,
     }),
     ('opsgenie://?apikey=abc&to=user', {
         # Test Kwargs
         'instance': NotifyOpsgenie,
+        'notify_type': NotifyType.FAILURE,
+        # Our response expected server response
+        'requests_response_text': OPSGENIE_GOOD_RESPONSE,
     }),
     ('opsgenie://apikey/#team/user/', {
         'instance': NotifyOpsgenie,
         # throw a bizzare code forcing us to fail to look it up
+        'notify_type': NotifyType.FAILURE,
+        # Our response expected server response
+        'requests_response_text': OPSGENIE_GOOD_RESPONSE,
         'response': False,
         'requests_response_code': 999,
     }),
     ('opsgenie://apikey/#topic1/device/', {
         'instance': NotifyOpsgenie,
+        'notify_type': NotifyType.FAILURE,
+        # Our response expected server response
+        'requests_response_text': OPSGENIE_GOOD_RESPONSE,
         # Throws a series of connection and transfer exceptions when this flag
         # is set and tests that we gracfully handle them
         'test_requests_exceptions': True,
@@ -140,14 +274,14 @@ apprise_url_tests = (
 )
 
 
-def test_plugin_opsgenie_urls():
+def test_plugin_opsgenie_urls(tmpdir):
     """
     NotifyOpsgenie() Apprise URLs
 
     """
 
     # Run our general tests
-    AppriseURLTester(tests=apprise_url_tests).run_all()
+    AppriseURLTester(tests=apprise_url_tests).run_all(str(tmpdir))
 
 
 @mock.patch('requests.post')
@@ -185,6 +319,7 @@ def test_plugin_opsgenie_config_files(mock_post):
     # Prepare Mock
     mock_post.return_value = requests.Request()
     mock_post.return_value.status_code = requests.codes.ok
+    mock_post.return_value.content = OPSGENIE_GOOD_RESPONSE
 
     # Create ourselves a config object
     ac = apprise.AppriseConfig()
@@ -217,3 +352,38 @@ def test_plugin_opsgenie_config_files(mock_post):
     assert len([x for x in aobj.find(tag='opsgenie_invalid')]) == 1
     assert next(aobj.find(tag='opsgenie_invalid')).priority == \
         OpsgeniePriority.NORMAL
+
+
+@mock.patch('requests.post')
+def test_plugin_opsgenie_edge_case(mock_post):
+    """
+    NotifyOpsgenie() Edge Cases
+    """
+    # Prepare Mock
+    mock_post.return_value = requests.Request()
+    mock_post.return_value.status_code = requests.codes.ok
+    mock_post.return_value.content = OPSGENIE_GOOD_RESPONSE
+
+    instance = apprise.Apprise.instantiate('opsgenie://apikey')
+    assert isinstance(instance, NotifyOpsgenie)
+
+    assert len(instance.store.keys()) == 0
+    assert instance.notify('test', 'key', NotifyType.FAILURE) is True
+    assert len(instance.store.keys()) == 1
+
+    # Again just causes same index to get over-written
+    assert instance.notify('test', 'key', NotifyType.FAILURE) is True
+    assert len(instance.store.keys()) == 1
+    assert 'a62f2225bf' in instance.store
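+    # (Observation) 'a62f2225bf' matches the first 10 hex digits of
+    # sha1('key'), suggesting the index key is a hash of the title used
+    # above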
+
+    # Assign it garbage
+    instance.store['a62f2225bf'] = 'garbage'
+    # This causes an internal check to fail since the keys are expected to
+    # be stored as a list (this one is now a string); the content
+    # self-corrects and things are fine
+    assert instance.notify('test', 'key', NotifyType.FAILURE) is True
+    assert len(instance.store.keys()) == 1
+
+    # a new key creates a new index
+    assert instance.notify('test', 'key2', NotifyType.FAILURE) is True
+    assert len(instance.store.keys()) == 2
diff --git a/test/test_plugin_rsyslog.py b/test/test_plugin_rsyslog.py
index 42ad3ca2..d2881904 100644
--- a/test/test_plugin_rsyslog.py
+++ b/test/test_plugin_rsyslog.py
@@ -137,6 +137,9 @@ def test_plugin_rsyslog_by_url(mock_getpid, mock_socket):
     assert obj.url().startswith('rsyslog://localhost:9000/daemon') is True
     assert re.search(r'logpid=no', obj.url()) is not None
 
+    # Verify our URL ID is generated
+    assert isinstance(obj.url_id(), str)
+
     # Test notifications
     # + 1 byte in size due to user
     # + length of pid returned
diff --git a/test/test_plugin_sfr.py b/test/test_plugin_sfr.py
index 7f052ecc..82430bdf 100644
--- a/test/test_plugin_sfr.py
+++ b/test/test_plugin_sfr.py
@@ -113,8 +113,7 @@ apprise_url_tests = (
         'privacy_url': (
             'sfr://service_id:****@0...0/0000000000?'
             'from=MyApp&timeout=30&voice=claire08s&'
-            'lang=fr_FR&media=SMSUnicode&format=text'
-            '&overflow=upstream&rto=4.0&cto=4.0&verify=yes'),
+            'lang=fr_FR&media=SMSUnicode'),
         # Our response expected server response
         'requests_response_text': SFR_GOOD_RESPONSE,
     }),
@@ -126,8 +125,7 @@ apprise_url_tests = (
         'privacy_url': (
             'sfr://service_id:****@0...0/0000000000?'
             'from=&timeout=2880&voice=laura8k&'
-            'lang=en_US&media=SMSUnicode&format=text'
-            '&overflow=upstream&rto=4.0&cto=4.0&verify=yes'),
+            'lang=en_US&media=SMSUnicode'),
         # Our response expected server response
         'requests_response_text': SFR_GOOD_RESPONSE,
     }),
@@ -139,8 +137,7 @@ apprise_url_tests = (
         'privacy_url': (
             'sfr://service_id:****@0...0/0000000000?'
             'from=&timeout=2880&voice=claire08s&'
-            'lang=fr_FR&media=SMS&format=text'
-            '&overflow=upstream&rto=4.0&cto=4.0&verify=yes'),
+            'lang=fr_FR&media=SMS'),
         # Our response expected server response
         'requests_response_text': SFR_GOOD_RESPONSE,
     }),
@@ -152,8 +149,7 @@ apprise_url_tests = (
         'privacy_url': (
             'sfr://service_id:****@0...0/0000000000?'
             'from=&timeout=2880&voice=claire08s&'
-            'lang=fr_FR&media=SMSUnicode&format=text'
-            '&overflow=upstream&rto=4.0&cto=4.0&verify=yes'),
+            'lang=fr_FR&media=SMSUnicode'),
         # Our failed notification expected server response
         'requests_response_text': SFR_BAD_RESPONSE,
         'requests_response_code': requests.codes.ok,
diff --git a/test/test_plugin_syslog.py b/test/test_plugin_syslog.py
index a0eeef6e..c0630a21 100644
--- a/test/test_plugin_syslog.py
+++ b/test/test_plugin_syslog.py
@@ -61,6 +61,9 @@ def test_plugin_syslog_by_url(openlog, syslog):
     assert re.search(r'logpid=yes', obj.url()) is not None
     assert re.search(r'logperror=no', obj.url()) is not None
 
+    # We do not support generation of a URL ID
+    assert obj.url_id() is None
+
     assert isinstance(
         apprise.Apprise.instantiate(
             'syslog://:@/'), NotifySyslog)
diff --git a/test/test_plugin_telegram.py b/test/test_plugin_telegram.py
index c8ba0364..af7d4422 100644
--- a/test/test_plugin_telegram.py
+++ b/test/test_plugin_telegram.py
@@ -1188,7 +1188,7 @@ def test_plugin_telegram_threads(mock_post):
 
     assert isinstance(aobj[0], NotifyTelegram)
 
-    body = 'my message'
+    body = 'my threaded message'
 
     assert aobj.notify(body=body)
 
diff --git a/test/test_plugin_windows.py b/test/test_plugin_windows.py
index bc1fb2d6..1bd6e198 100644
--- a/test/test_plugin_windows.py
+++ b/test/test_plugin_windows.py
@@ -111,6 +111,9 @@ def test_plugin_windows_mocked():
     # Test URL functionality
     assert isinstance(obj.url(), str)
 
+    # Verify that a URL ID cannot be generated
+    assert obj.url_id() is None
+
     # Check that it found our mocked environments
     assert obj.enabled is True