How much traffic do you use on the net a day?

Who are you with? I'm with Internode, and my 40 GB plan just isn't enough.

You ready for it....

Bigpond!

I think I'm that one-in-a-million lucky customer who gets everything they could ever want. When I switched from 512/128 to 1.5/256, they actually put me on an 8 Mb plan for a good 3 months. Eventually I had to call about an email account issue, and the way the system worked, they just couldn't fix it without noticing the discrepancy. When asked if I knew, I just said "Huh? What's that?" and they 'fixed' it :LOL:

Still, unlimited is unlimited is unlimited, and I'm a happy camper.
 
Sometimes maybe a gig; other times I'll be grabbing as much stuff as my throughput allows ;)
I will say my usage has gone way up since I got access to a Usenet server ;)
 
Usenet hurts my brain. I spent 3 hours at work the other day just finding a client, then another hour finding servers to get on... then 30 minutes finding something to download, then 3 years trying to work out how to download all of it, because despite it telling me that everything was linked, it wasn't.

Why can't it just be simple? Download a generic client, click to join a server from a list, search through all available files, download what you want. But instead you get some jumbled, ugly-ass client that's so horrible to navigate you'd rather just give up.

What the heck am I missing? Do people just put up with this crap because you get good speeds? :devilish:
 
It's really simple. There are websites that let you download stuff easily without having to mess with headers yourself, and there are clients that automate downloading, repairing, extracting, and then deleting what isn't needed anymore. I have a good guide on how to use Usenet over at Rage3D; searching for my username and "Usenet guide" should pop it up for you. The site Slyck also has a Usenet guide and a forum.
 
The best Usenet program out there is AltBinz. It's the Usenet equivalent of uTorrent. You configure your server(s), connect, and have it watch a directory for NZB files. You download the NZB files from other sites or search using the built-in supported sites. The program does all the rest: it grabs the PAR/PAR2 file first, then the main file, performs PAR checking, grabs any additional PAR files needed, repairs the archive, and optionally extracts it to a configured location (a rough sketch of that pipeline is below).

It uses very minimal resources. At full speed (1608-2048 Kbit/sec) with the program window open, it never used more than 32 MB of memory. When it finished pulling everything down, it dropped to 9 MB. Minimized, it dropped to 2.1 MB, and when restored it went back up to 8 MB.
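For anyone curious what that "does all the rest" pipeline actually involves, here's a rough sketch in Python. To be clear, this is not AltBinz's code; it just assumes the stock par2 and unrar command-line tools are installed, and that the downloaded segments have already been decoded into files in a job directory (the paths at the bottom are placeholders):

```python
# Sketch of the post-download steps a client like AltBinz automates:
# verify/repair with par2, extract the archive, then tidy up.
# Assumes the `par2` and `unrar` CLI tools are on the PATH.
import subprocess
from pathlib import Path

def finish_job(job_dir: Path, extract_to: Path) -> None:
    extract_to = extract_to.resolve()

    # par2 verifies first and only rewrites blocks if something is actually
    # damaged; pointing it at the index .par2 lets it find the volume files.
    par2_sets = sorted(job_dir.glob("*.par2"))
    if par2_sets:
        subprocess.run(["par2", "repair", par2_sets[0].name],
                       cwd=job_dir, check=True)

    # For a multi-volume RAR set, extracting the first volume makes unrar
    # follow the whole chain automatically; -o+ overwrites without asking.
    rars = sorted(job_dir.glob("*.rar"))
    if rars:
        extract_to.mkdir(parents=True, exist_ok=True)
        subprocess.run(["unrar", "x", "-o+", rars[0].name, str(extract_to)],
                       cwd=job_dir, check=True)

    # Optionally delete what isn't needed anymore (archives and PAR files).
    for leftover in par2_sets + rars + sorted(job_dir.glob("*.r[0-9][0-9]")):
        leftover.unlink()

# Placeholder paths, just to show the call shape:
finish_job(Path("incoming/some-post"), Path("done/some-post"))
```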
 
An NZB file is nothing but an index file that designates which articles/posts are included. With that said, the "library" of NZBs is effectively all of Usenet/newsgroups.

From Wikipedia:
NZB is an XML-based file format for retrieving posts from NNTP (Usenet) servers. The format was conceived by the developers of the Newzbin.com Usenet Index. NZB is effective when used with search-capable websites. These websites create NZB files out of what needs to be downloaded. Using this concept, headers need not be downloaded, hence the NZB method is quicker and more bandwidth-efficient.

Each Usenet message has a unique identifier called the "Message-ID". When a somewhat large file is posted to a Usenet newsgroup, it is usually divided into multiple messages (called segments or parts), each having its own Message-ID. An NZB-capable Usenet client will read the Message-IDs from the NZB file, download them, and decode the messages back into a binary file (usually using yEnc or uuencode).

Using dedicated Usenet Index websites, such as Newzbin.com itself, the user is able to create an NZB by selecting a range of files that they wish to obtain from Usenet.
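Since an NZB is just XML, pulling the Message-IDs out of one takes only a few lines. A minimal sketch in Python using the standard library (the namespace URI is the one from the Newzbin NZB DTD; the file name at the bottom is a placeholder):

```python
# Minimal NZB reader: an NZB is plain XML, so ElementTree is enough.
import xml.etree.ElementTree as ET

NS = "{http://www.newzbin.com/DTD/2003/nzb}"  # namespace from the NZB DTD

def parse_nzb(path):
    """Return the files in an NZB, each with its groups and ordered segment IDs."""
    files = []
    for f in ET.parse(path).getroot().iter(NS + "file"):
        groups = [g.text for g in f.iter(NS + "group")]
        # <segment number="1" bytes="...">message-id</segment>, one per part
        segments = sorted((int(s.get("number")), s.text)
                          for s in f.iter(NS + "segment"))
        files.append({
            "subject": f.get("subject"),
            "groups": groups,
            "segment_ids": [mid for _, mid in segments],
        })
    return files

for item in parse_nzb("example.nzb"):  # file name is a placeholder
    print(item["subject"], "-", len(item["segment_ids"]), "parts")
```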
 
How big a library of NZB files is there?

There are quite a few NZB indexers out there, so rather than waiting for a post with an NZB, they collect all the headers and then generate an NZB on the fly from whatever headers you choose. You choose what you want based on the results from the search engine, and then just use the generated NZB (which is just a list of Message-IDs) to ask a Usenet server for those specific articles.

It's quite different, and a lot more focused, than the browsing experience you get when looking at all the headers from the complete contents of a newsgroup.
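To make "ask a Usenet server for those specific articles" concrete, here's a bare-bones sketch of that request at the NNTP protocol level (RFC 3977). The host and Message-ID are placeholders, and a real commercial server will usually want an AUTHINFO USER/PASS exchange first, which is omitted here:

```python
# Fetch one article by Message-ID over raw NNTP (port 119, plaintext).
import socket

def fetch_article(host: str, message_id: str, port: int = 119) -> bytes:
    """Return the raw article (headers + body) for one Message-ID."""
    with socket.create_connection((host, port)) as sock:
        f = sock.makefile("rwb")
        f.readline()                      # server greeting, e.g. "200 ..."
        f.write(b"ARTICLE <" + message_id.encode() + b">\r\n")
        f.flush()
        status = f.readline()             # "220 ..." means the article follows
        if not status.startswith(b"220"):
            raise RuntimeError(status.decode(errors="replace").strip())
        lines = []
        while True:
            line = f.readline()
            if line == b".\r\n":          # a lone dot terminates the article
                break
            if line.startswith(b".."):    # undo dot-stuffing per the NNTP spec
                line = line[1:]
            lines.append(line)
        f.write(b"QUIT\r\n")
        f.flush()
        return b"".join(lines)

# Placeholder server and ID, just to show the call shape:
# raw = fetch_article("news.example.com", "part1of20.abc123@example")
```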
 
Linux here. Pretty well my rule is: trust all outgoing connections, i.e. NAT, plus a few holes poked through for BT etc.
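For the curious, a sketch of that policy as iptables rules driven from a small Python script. The BitTorrent port is an assumption (use whatever your client actually listens on), and it obviously needs root:

```python
# "Trust all outgoing, drop unsolicited incoming" ruleset, with one hole
# poked through for BitTorrent. Port 6881 is an assumed default.
import subprocess

BT_PORT = "6881"  # whatever your BitTorrent client listens on

RULES = [
    ["iptables", "-P", "OUTPUT", "ACCEPT"],                   # trust outgoing
    ["iptables", "-P", "INPUT", "DROP"],                      # drop unsolicited
    ["iptables", "-A", "INPUT", "-i", "lo", "-j", "ACCEPT"],  # allow loopback
    # allow replies to connections we started (the NAT-like behaviour)
    ["iptables", "-A", "INPUT", "-m", "state",
     "--state", "ESTABLISHED,RELATED", "-j", "ACCEPT"],
    # the hole poked through for BitTorrent
    ["iptables", "-A", "INPUT", "-p", "tcp",
     "--dport", BT_PORT, "-j", "ACCEPT"],
]

for rule in RULES:
    subprocess.run(rule, check=True)
```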
 
Usenet hurts my brain. I spent 3 hours at work the other day just finding a client, then another hour finding servers to get on... then 30 minutes finding something to download, then 3 years trying to work out how to download all of it, because despite it telling me that everything was linked, it wasn't.

I used to use a truly wonderful Windows Usenet program called "Gravity". It wasn't free, but then it wasn't expensive either. Unfortunately, IIRC, the company went out of business, but they did release their final version for free. I have no idea if it works with the latest Windows versions, but I do think it might be worth a try.
 
At the moment I'm averaging around 1 GB a day; usually it's more like 2 or 3, depending on what apps are running. ;)

Still, 10 Mbps down / 1 Mbps up is quite nice, I must say. And that's in Switzerland, too!
 
That site is an automated bot indexer; it has problems with cryptically named files and sometimes just doesn't index certain files for some reason. After using a ton of Usenet search sites, I came back to the one I used first as the best: the site that created the NZB format. It has the editor-reported files, which are nicely categorized. You can also search raw Usenet headers directly, and another option is searching NZB files that were indexed automatically straight from Usenet.
 
Yes, but that only works for very recent posts.
 
What does? I can search with all three search methods as far back as 100 days, the maximum retention that site is capable of. Retention is a setting in the user control panel.
 
I'm a collector and enthusiast, and I think I average about 5 GB down per day and 40 GB up, with some severe spikes now and again as I buy more hard drives. I'm waiting for the Hitachi 1 TB to reach the stores any time now.

100/100 Mbit is the sweet life :)
 