Poll: Good idea?
Yes: 6 votes (100.00%)
No: 0 votes (0%)
Total: 6 vote(s)

Multi-domain search spider.
#1
Here's an idea inspired by the need to house a million files while only having a small space to do it.

A search engine which meets the following requirements:

- The search interface can be used on any site and styled with CSS to suit that site, or modified manually if no CSS exists.
- It has one central point (in this case www.qbasicnetwork.com, because I want it on my site), like any search engine.
- The ability for users to submit a folder of their site to the search engine.
- The ability for each site on the list to crawl its registered folders and push updates to the hub's MySQL database every 24/48 hours (a rough sketch of the hub's tables follows this list).
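
To make the data side concrete, here's a minimal sketch of what the hub's index could look like. The table and column names are placeholders, and sqlite3 stands in for the MySQL database mentioned above just so the sketch runs standalone.

# Hub-side index sketch. Names are placeholders; the real hub would
# use the MySQL database described in the requirements.
import sqlite3

conn = sqlite3.connect("hub_index.db")
conn.executescript("""
CREATE TABLE IF NOT EXISTS sites (
    site_id        INTEGER PRIMARY KEY,
    base_url       TEXT NOT NULL,         -- a member site's root URL
    folder         TEXT NOT NULL,         -- the folder the admin registered
    status         TEXT DEFAULT 'online', -- 'online' or 'offline'
    offline_reason TEXT,                  -- supplied by the host during maintenance
    last_crawl     TEXT                   -- refreshed every 24/48 hours
);
CREATE TABLE IF NOT EXISTS files (
    file_id INTEGER PRIMARY KEY,
    site_id INTEGER REFERENCES sites(site_id),
    path    TEXT NOT NULL,                -- path within the registered folder
    title   TEXT                          -- whatever the spider extracts
);
""")
conn.commit()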

When a search query is sent, it goes back to the hub and the databases are searched. The results are displayed on the site the query was sent from, in that site's style or not, depending on the site's admin.
- If a site goes offline, its results and links are still displayed, but marked as offline, with a reason.
- The reason for being offline is given by the file host when they conduct maintenance.

Any changes would be picked up within 24/48 hours.
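
And a rough sketch of the hub-side search itself, using the connection from the sketch above. Everything here is again a placeholder; the point is just that results come back tagged online/offline so the member site can render them however its admin likes.

# Hub-side search sketch: a member site forwards the query here and
# gets back results tagged online/offline. 'conn' is the sqlite3
# connection from the schema sketch above.
def search(conn, query):
    rows = conn.execute("""
        SELECT s.base_url, s.status, s.offline_reason, f.path, f.title
        FROM files f JOIN sites s ON s.site_id = f.site_id
        WHERE f.title LIKE ? OR f.path LIKE ?
    """, ("%" + query + "%", "%" + query + "%")).fetchall()
    results = []
    for base_url, status, reason, path, title in rows:
        results.append({
            "link": base_url + "/" + path,
            "title": title,
            "offline": status == "offline",
            "reason": reason,  # shown when the host is down for maintenance
        })
    return results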

Pros:
- Eliminates storage problems
- The database is never completely offline for maintenance
- Everyone contributes

Cons:
- Possible permission problems
- It may not be possible due to a number of factors, but perhaps these can be worked around?

Don't ask me about a prize; I have nothing to offer. I'd just like to see it done, since I have all these files and nowhere to put them.

If I am not being clear enough, let me know.

>anarky
Screwing with your reality since 1998.
Reply
#2
No suggestions?

>anarky
Screwing with your reality since 1998.
Reply
#3
Oh, I think you're clear enough, yes. Because of the nature of multi-domain spidering, it probably could be done, as long as you know what the "securities" (security restrictions) are on each domain where the files are hosted and code things accordingly.
When they say it can't be done, THAT's when they call me ;-).


need hosting: http://www.jc-hosting.net
All about ASCII: http://www.ascii-world.com
Reply
#4
Well, that's all included, but for the network to crawl a site, the admin must register a folder and set permissions on their server accordingly.
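
For one way a crawl could honor those permissions, here's a sketch that checks the site's robots.txt before touching the registered folder. The spider name "qbn-spider" is made up for the example.

# Permission check sketch: consult robots.txt before crawling the
# registered folder. "qbn-spider" is a hypothetical user agent.
from urllib.robotparser import RobotFileParser

def may_crawl(base_url, folder):
    rp = RobotFileParser()
    rp.set_url(base_url + "/robots.txt")
    rp.read()
    return rp.can_fetch("qbn-spider", base_url + folder)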

>anarky
Screwing with your reality since 1998.
Reply

