# get-url Enhanced URL Search

The `get-url` command now supports searching for URLs across all stores, with automatic stripping of the protocol and the `www` prefix.

## Features

### 1. **Protocol Stripping**

URLs are normalized by removing:

- Protocol prefixes: `https://`, `http://`, `ftp://`, etc.
- `www.` prefix (case-insensitive)

### 2. **Wildcard Matching**

Patterns support standard wildcards:

- `*` - matches any sequence of characters
- `?` - matches any single character

### 3. **Case-Insensitive Matching**

All matching is case-insensitive for domains and paths; see the matching sketch under Implementation Details below.

## Usage Examples

### Search by full domain

```bash
get-url -url "www.google.com"
# Matches:
# - https://www.google.com
# - http://google.com/search
# - https://google.com/maps
```

### Search with a full YouTube URL

```bash
get-url -url "https://www.youtube.com/watch?v=xx_88TDWmEs"
# Becomes: youtube.com/watch?v=xx_88tdwmes
# Matches:
# - https://www.youtube.com/watch?v=xx_88TDWmEs
# - http://youtube.com/watch?v=xx_88TDWmEs
```

### Domain wildcard matching

```bash
get-url -url "youtube.com*"
# Matches any URL starting with youtube.com:
# - https://www.youtube.com/watch?v=123
# - https://youtube.com/shorts/abc
# - http://youtube.com/playlist?list=xyz
```

### Subdomain matching

```bash
get-url -url "*example.com*"
# Matches:
# - https://cdn.example.com/file.mp4
# - https://www.example.com
# - https://api.example.com/endpoint
```

### Specific path matching

```bash
get-url -url "youtube.com/watch*"
# Matches:
# - https://www.youtube.com/watch?v=123
# - http://youtube.com/watch?list=abc
# Does NOT match:
# - https://youtube.com/shorts/abc
```

## Get URLs for a Specific File

The original functionality is still supported:

```bash
@1 | get-url
# Requires hash and store from piped result
```

## Output

Results are organized by store and show:

- **Store**: Backend name (hydrus, folder, etc.)
- **Url**: The full matched URL
- **Hash**: First 16 characters of the file hash (for compactness)

## Implementation Details

The search:

1. Iterates through all configured stores
2. Searches for all files in each store (limit 1000 per store)
3. Retrieves URLs for each file
4. Applies pattern matching with normalization
5. Returns results grouped by store
6. Emits `UrlItem` objects for piping to other commands
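The normalization and matching behavior described above can be sketched as follows. This is a minimal Python illustration, not the actual implementation; in particular, treating a pattern without wildcards as a prefix match is an assumption made to reproduce the `www.google.com` example.

```python
import re
from fnmatch import fnmatchcase

def normalize_url(url: str) -> str:
    """Strip the protocol prefix and a leading 'www.', then lowercase."""
    url = re.sub(r"^[a-zA-Z][a-zA-Z0-9+.-]*://", "", url)   # https://, http://, ftp://, ...
    url = re.sub(r"^www\.", "", url, flags=re.IGNORECASE)
    return url.lower()

def url_matches(pattern: str, url: str) -> bool:
    """Case-insensitive wildcard match of a normalized pattern against a normalized URL."""
    pattern = normalize_url(pattern)
    # Assumption: a pattern with no wildcards behaves as a prefix match,
    # which is what the full-domain example above suggests.
    if "*" not in pattern and "?" not in pattern:
        pattern += "*"
    return fnmatchcase(normalize_url(url), pattern)

# Both sides are normalized before matching:
assert url_matches("https://www.youtube.com/watch?v=xx_88TDWmEs",
                   "http://youtube.com/watch?v=xx_88TDWmEs")
assert url_matches("youtube.com/watch*", "https://www.youtube.com/watch?v=123")
assert not url_matches("youtube.com/watch*", "https://youtube.com/shorts/abc")
```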
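The six steps above could look roughly like the loop below. The store interface (`search_files`, `get_urls_for`) and the `UrlItem` fields are assumptions for illustration only, not the real backend API.

```python
from dataclasses import dataclass

@dataclass
class UrlItem:
    store: str       # backend name (hydrus, folder, ...)
    url: str         # the full matched URL
    file_hash: str   # shown truncated to 16 characters in the output

def search_urls(stores, pattern, limit=1000):
    """Walk every configured store and emit UrlItem objects for matching URLs.

    `stores` is assumed to map a store name to an object exposing
    `search_files(limit)` and `get_urls_for(file_hash)`; these names
    are hypothetical placeholders for the actual store interface.
    """
    results = []
    for store_name, store in stores.items():                  # step 1
        for file_hash in store.search_files(limit=limit):     # step 2: limit 1000 per store
            for url in store.get_urls_for(file_hash):         # step 3
                if url_matches(pattern, url):                  # step 4: normalized wildcard match
                    results.append(UrlItem(store_name, url, file_hash))
    return results  # steps 5-6: store-by-store iteration keeps results grouped by store
```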