Just thoughts, just ideas. I'm not an expert, this is brainstorming. Umbrella recommended.
When you type in a URL, its domain part gets sent to a DNS server. This converts
https://example.com ( well, the example.com part of that URL! )
...to...
104.18.27.120 ( An IP address! )
( source: https://dnschecker.org/#A/example.com )
This is the DNS Protocol in action.
Can you imagine if this wasn't part of the web? Imagine if you had to manually type in, say, 142.251.41.174 ( hey, when I pasted that, leaflet auto-embedded an IP locator site, pretty neat! )
Anyway, imagine having to type in an IP address to get to a site that provided the ever-so-necessary service of converting your URL to an IP address.
Right, that seems illogical. Why go through a service to get your IPs when you can build that into the Web!
The service ( or all competing services ) would have to track and index web sites, or have pages dedicated to submitting your site, and probably fight for IPs with numbers that spell out cool things.
Glad we implemented DNS as a protocol, right?
What if...
What if you could go a step further?
What if you could do that for search queries?
I don't know if it's possible... ( is it possible? ) But imagine if it was!
You type in a search query.
Miss Query goes to a SQP server.
Officer SQP looks at some keywords Miss Q has and points her off in the right direction, this being the cooking area, as she is a Query for the Duration a Pot Roast must be Slow Cooked for.
Miss Q travels to a sub node (see, this is like how DNS servers are in a hierarchy of domains) and is there directed to the Cooking Of Meats SQP server, where she gives her address of origin.
The server applies some Algorithms to Miss Q's query and ships off a list of links to you, who receive them and are on your way to cooking that pot roast.
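Miss Q's journey above can be sketched as a toy routing table. To be clear, everything in this snippet is invented for illustration: the node names, the keyword-based "delegation", and the placeholder links are all assumptions about how a hypothetical SQP hierarchy might work, by analogy with DNS delegation.

```python
# A toy model of the hypothetical SQP hierarchy: each node either
# delegates on a keyword (like a DNS zone pointing to a sub-zone)
# or answers with a list of links (like an authoritative server).
SQP_TREE = {
    "root": {"route": {"cook": "cooking", "code": "programming"}},
    "cooking": {"route": {"roast": "cooking-of-meats"}},
    "cooking-of-meats": {"links": [
        "https://example.com/pot-roast-times",   # placeholder links
        "https://example.com/slow-cooking-101",
    ]},
}

def resolve_query(query: str, node: str = "root") -> list[str]:
    """Walk the hierarchy, hopping node to node on keyword
    matches until some node answers with links."""
    entry = SQP_TREE[node]
    if "links" in entry:
        return entry["links"]
    for keyword, child in entry["route"].items():
        if keyword in query.lower():
            return resolve_query(query, child)
    return []  # no node claimed this query

print(resolve_query("How long must a pot roast be slow cooked?"))
```

Real search obviously involves ranking, not just routing, but the point of the sketch is the shape: a hierarchy of servers that delegate queries downward, the way DNS delegates name lookups.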
What are the benefits?
- 1.
Indexing, namely the ease of it. If this is built into the Web, why, you can index every site as soon as it pops up. No need for messy crawlers and scrapers and bots eating up traffic to ensure they always have the latest, the best, the most relevant version of your site.
- 2.
Decentralization. This would ensure that no single party controls your requests, and thus what you can see on the Internet.
- 3.
Perhaps I'm a bit Mr. Ulterior Motive, but just imagine the data! Tons of it! Hopefully public-access and anonymized, but this would be neat! Lots of statistical algos to run on it. LOTS.
Now, I've exhausted my list of (hypothetical!) (unresearched!) ideas, and I never know how to end an article, so BYE! ADIEU! *hops in getaway vehicle and speeds off into the sunset*