Registered Member
Now for something completely different
I was listening to MacBreak Weekly a while ago when they mentioned that there is an "offline Wikipedia reader" for the iPhone. The app downloads the entire Wikipedia to the phone, minus the pictures; apparently it takes about 2 GB of space. How about having something similar for KDE as well?

I know: "Why not simply read it online?" Well, people are not online all the time. We are more and more connected these days, but we are not 100% connected. If you are on a plane, you have no network connection, but you might still like to read Wikipedia. I can see a system like this being very useful on netbooks and the like, since people travel with those and a network connection is not guaranteed.

2 GB seems like a reasonable amount of space, but we could also handle pictures; it would be up to the user. I could see 4-8 GB of space being dedicated to Wikipedia. The user would then have all the information he could ever want with him, all the time. Thoughts?
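Just to make the "text only, no pictures" part concrete: the public MediaWiki API can already hand out the plain text of an article, so a reader would not even have to parse HTML. A minimal sketch (the article title, User-Agent string and function name are only examples, not anything that exists in KDE):

```python
# Minimal sketch: download the plain-text body of one article (no images)
# from the public MediaWiki API. Names and the User-Agent string are examples.
import json
import urllib.parse
import urllib.request

API = "https://en.wikipedia.org/w/api.php"

def fetch_plain_text(title):
    """Return the article text without HTML markup or images."""
    params = urllib.parse.urlencode({
        "action": "query",
        "prop": "extracts",
        "explaintext": 1,        # plain text instead of HTML
        "titles": title,
        "format": "json",
    })
    req = urllib.request.Request(
        f"{API}?{params}",
        headers={"User-Agent": "offline-wiki-sketch/0.1"},  # Wikimedia asks for a descriptive UA
    )
    with urllib.request.urlopen(req) as resp:
        pages = json.load(resp)["query"]["pages"]
    # The result is keyed by page id; there is exactly one page here.
    return next(iter(pages.values())).get("extract", "")

if __name__ == "__main__":
    text = fetch_plain_text("Wikipedia")
    print(f"downloaded {len(text)} characters of plain text")
```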
Freedom is not a destination, it's a journey
Alumni
Well, my netbook (eeePC 701) has only 4 GB of space, so I wouldn't want to have the whole Wikipedia on it... ;-)
But I, too, thought of saving some relevant articles on a USB stick for my holiday this year, so this idea might not be that bad... It is already possible to use website archiving tools to download a whole web page - there is even a Firefox addon for this - although I don't know how it would cope with the "2 599 000+ articles" of the English Wikipedia. ;-) Of course, it would need an updating tool to bring the local copy up to date when you are online again... Indexing the data for a reasonable search function probably also takes some disk space. To reduce the needed space, it would be good if the user could decide to download only some categories, or perhaps only an article and those that are linked from it...
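By the way, the "only some categories" part would not even need a website archiver: the MediaWiki API can list the articles of a category directly, and the tool could then fetch just those. A rough, hypothetical sketch (category name, limit and function name are only examples):

```python
# Rough sketch of the "download only some categories" idea: list the articles
# in one category via the MediaWiki API. Category name and limit are examples.
import json
import urllib.parse
import urllib.request

API = "https://en.wikipedia.org/w/api.php"

def category_articles(category, limit=100):
    """Return the titles of articles directly contained in `category`."""
    params = urllib.parse.urlencode({
        "action": "query",
        "list": "categorymembers",
        "cmtitle": f"Category:{category}",
        "cmnamespace": 0,        # articles only, no sub-categories or files
        "cmlimit": limit,
        "format": "json",
    })
    req = urllib.request.Request(
        f"{API}?{params}",
        headers={"User-Agent": "offline-wiki-sketch/0.1"},
    )
    with urllib.request.urlopen(req) as resp:
        members = json.load(resp)["query"]["categorymembers"]
    return [m["title"] for m in members]

if __name__ == "__main__":
    for title in category_articles("KDE"):
        print(title)
```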
michael4910, proud to be a member of KDE forums since 2008-Oct.
Mentor
Perhaps there are some tools from Wikipedia which are used by the mirror hosts. Those should also update only the new parts, not the whole database, at each update.
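If I remember correctly, the mirrors usually work from the official database dumps on dumps.wikimedia.org; those are full snapshots that get regenerated every so often, so fetching "only the new parts" would probably have to go through something like the API's recent-changes list instead. The simplest thing to sketch is just checking whether a newer full dump has appeared (the dump file name is the publicly known one, everything else is made up):

```python
# Sketch: check whether a newer copy of the official article dump exists
# before re-downloading it. The local file name is just an example.
import os
import email.utils
import urllib.request

DUMP_URL = ("https://dumps.wikimedia.org/enwiki/latest/"
            "enwiki-latest-pages-articles.xml.bz2")
LOCAL = "enwiki-latest-pages-articles.xml.bz2"

def newer_dump_available():
    """Compare the server's Last-Modified date with the local file's mtime."""
    req = urllib.request.Request(DUMP_URL, method="HEAD")
    with urllib.request.urlopen(req) as resp:
        last_modified = resp.headers.get("Last-Modified")
    if not os.path.exists(LOCAL):
        return True
    if last_modified is None:
        return True  # server gave no date; assume we should refresh
    remote = email.utils.parsedate_to_datetime(last_modified)
    return remote.timestamp() > os.path.getmtime(LOCAL)

if __name__ == "__main__":
    print("newer dump available" if newer_dump_available() else "local copy is current")
```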
Registered Member
Yeah, there could be "issues"
Yeah, it's possible in theory, but the process is far from ideal. I would like an app that asks you which Wikipedias you want to download and whether you want images as well... The user could also define a "depth" to which he wants to go, if he doesn't want all of it. He could say, for example, "only go three links deep from the main page" (even just three links deep would bring in A LOT of stuff; non-Wikipedia links would of course be excluded). That way the user could read the articles and even the interlinked articles. A rough sketch of that idea is at the end of this post.
Naturally
Maybe. But with the amount of storage space going through the roof, I feel that that is less and less of an issue.
Yes
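Here is the promised rough sketch of the "three links deep" idea: a depth-limited, breadth-first walk over internal article links via the MediaWiki API, so non-Wikipedia links never show up at all. Depth and per-page limits are kept tiny on purpose, and all names are made up:

```python
# Hypothetical sketch of "only go N links deep from the main page": a
# breadth-first, depth-limited walk over internal article links. The depth
# and per-page link limit are tiny here; depth 3 would reach a huge number
# of pages.
from collections import deque
import json
import urllib.parse
import urllib.request

API = "https://en.wikipedia.org/w/api.php"

def internal_links(title, limit=20):
    """Main-namespace articles linked from `title` (external links never appear here)."""
    params = urllib.parse.urlencode({
        "action": "query", "prop": "links", "plnamespace": 0,
        "pllimit": limit, "titles": title, "format": "json",
    })
    req = urllib.request.Request(
        f"{API}?{params}", headers={"User-Agent": "offline-wiki-sketch/0.1"})
    with urllib.request.urlopen(req) as resp:
        page = next(iter(json.load(resp)["query"]["pages"].values()))
    return [link["title"] for link in page.get("links", [])]

def crawl(start="Main Page", max_depth=2):
    """Yield every article title reachable from `start` within `max_depth` links."""
    seen, queue = {start}, deque([(start, 0)])
    while queue:
        title, depth = queue.popleft()
        yield title
        if depth == max_depth:
            continue
        for linked in internal_links(title):
            if linked not in seen:
                seen.add(linked)
                queue.append((linked, depth + 1))

if __name__ == "__main__":
    for title in crawl(max_depth=1):
        print(title)
```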
Freedom is not a destination, it's a journey
KDE Developer
anda_skoa, proud to be a member of KDE forums since 2008-Oct.
Registered Member
Thank you, Mister
Freedom is not a destination, it's a journey
Alumni
Looking at the Roadmap and the changelog, I would say they definitely need some more manpower...
michael4910, proud to be a member of KDE forums since 2008-Oct.