Usenet problems

General discussions and other topics.
by tufur » Sat Jul 30, 2016 2:36 am
I went and found a small reseller in the Netherlands, which has far fewer DMCA problems. I chose a throughput-by-number-of-logins deal that maxes out my 24Mb bandwidth. It costs 4 euros/month (about $4.50). I have been on dslextreme-provided Easynews and Supernews, both before and after the major consolidation of the Usenet companies. They are expensive and they have problems serving everyone. I guess I will see how this one works out. It has been fully stocked and rocking for three hours. I found two others that looked interesting too. It is not as bad as I thought it would be.
by Guest » Sat Jul 30, 2016 7:37 am
How about just keeping the Supernews side of it? Yes, I know that it's throttled, but this way, Sonic doesn't have to worry about maintaining their own server architecture and lets Supernews handle it all.

(I'm assuming that supernews.sonic.net doesn't rely on the local news.sonic.net architecture. Maybe I'm wrong?)
by jneal » Sat Jul 30, 2016 2:36 pm
When I first joined Sonic back in the dial-up days, your fast free news server was a big reason. How is it that it was affordable back then but not now? Has the cost of maintaining a news server increased to the point that it can't be bundled into subscriber fees and marketed as a unique and worthwhile service that virtually no other ISP provides?

And if that can't happen can we at least have a decent funeral? The passing of an era that some of us remember so dearly? Dane, you surely must have a soft spot for your early memories of BBS, Gopher, Archie, FTP and Usenet, no? Is .0001% of your budget that critical to be so callous about killing it?
by Guest » Sat Jul 30, 2016 9:59 pm
jneal wrote:When I first joined Sonic back in the dial-up days, your fast free news server was a big reason. How is it that it was affordable back then but not now?
I don't have visibility into Sonic's infrastructure, but back in the "good ol' days," a daily Usenet feed might fall in the gigabyte range, while today the majority of the posts are binaries--many of which are questionable in nature--taking up hundreds of GBs if not a couple of TBs per day. Retain 1-5 years' worth and there's your answer.
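A rough back-of-envelope makes the scale obvious (the feed size and retention window here are assumed round numbers, not anyone's actual figures):

# Back-of-envelope spool-size arithmetic; the daily feed size and the
# retention window are illustrative assumptions only.
feed_per_day_tb = 2      # assume ~2 TB of binaries per day
retention_years = 3      # assume 3 years of retention
spool_tb = feed_per_day_tb * 365 * retention_years
print(f"{spool_tb} TB, roughly {spool_tb / 1000:.1f} PB")   # 2190 TB, roughly 2.2 PB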

I grew up on Usenet, after a quick foray into BITnet. It has a special place in my heart--I witnessed the creation of comp.sys.next and rec.arts.anime. But let's face it, it's no longer used by most people these days. Be happy you can subscribe to Usenet services at all. If you're into binaries, services in Europe are better because they don't respond to DMCA takedowns as quickly as those in the States.
by ankh » Mon Aug 01, 2016 1:19 pm
To repeat the unanswered question
(I'd originally asked Dane, but welcome answers from anyone who knows)
Usenet; can you get access to it elsewhere? Yes, through Google Groups or any of the outside Usenet services.
Can you recommend one as trustworthy not to be filtering what we're allowed to see?

And, per the linked article recommending five -- which are the four resellers, and which one has its own code?

I understand https://www.ece.cmu.edu/~ganger/712.fal ... ompson.pdf.

But I recently pointed out -- to a young entrepreneurial programmer who's done well creating and selling three companies so far -- the section of Orwell's 1984 where people go out and edit libraries to change the facts that people will be able to find, and said how much easier this must be with computers.

He had never thought about the possibility.
by Guest » Mon Aug 01, 2016 9:44 pm
ankh wrote:To repeat the unanswered question
(I'd originally asked Dane, but welcome answers from anyone who knows)
Usenet; can you get access to it elsewhere? Yes, through Google Groups or any of the outside Usenet services.
Can you recommend one as trustworthy not to be filtering what we're allowed to see?
Assuming you're talking about DMCA takedown requests, everyone I've found so far will honor legitimate DMCA notices. Since this shields a service from legal liability, everyone takes advantage of that protection.

Some are overly proactive and act even on false DMCA requests (apparently, AstraWeb is like this), so you need to keep that in mind too ...
And, per the linked article recommending five -- which are the four resellers, and which one has its own code?
Per the linked article, the following are resellers of the Highlands/UNS Holdings root server:
  • Newshosting
  • UsenetServer
  • NewsDemon
  • EasyNews
Only AstraWeb operates their own root server.

Also note that GigaNews/Supernews, which Sonic is currently linked to, also maintains its own root server.
by ankh » Mon Aug 01, 2016 11:28 pm
Thanks, I'm not worried about legitimate DMCA takedowns.
More concerned about political takedowns and rewriting history over the longer term.

Appreciate the info on who has their own service vs. resellers.
by Guest » Fri Aug 05, 2016 12:53 am
dane wrote:A Usenet server array isn't something purchased and deployed, it's a platform that must be built. [ ... ]
May I prevail upon you to expand on this a bit? I'm conceptualizing an NNTP server as a machine with a few TB of spindles that sits in a rack and more or less looks after itself. I'm not grokking what you mean when you say, "platform."
by drew.phillips » Fri Aug 05, 2016 9:53 am
Guest wrote:
dane wrote:A Usenet server array isn't something purchased and deployed, it's a platform that must be built. [ ... ]
May I prevail upon you to expand on this a bit? I'm conceptualizing an NNTP server as a machine with a few TB of spindles that sits in a rack and more or less looks after itself. I'm not grokking what you mean when you say, "platform."
My 2 cents on Usenet being a platform are:

- Peering with other news providers. You must establish relationships with other providers to send/receive messages and agree on which hierarchies you exchange with each other. For example, one peer may agree to send you everything from a particular hierarchy, like sci.*, but nothing from others. It's a complex set of relationships that dictates which messages you will and won't receive from other peers, all running over a distributed, server-to-server network.
- Retention. A few TB of storage could be a huge understatement. Depending on your retention period and the groups you carry, this can run into the petabytes. Even for text only, a hypothetical 30 GB of news per day retained for 1 year (a low retention period) would require over 10 TB of storage (just for the content itself, not counting filesystem overhead, etc.).
- Network bandwidth. You have to consider (even with a small number of users) the fact that a handful of people may pop on at the same time every couple of weeks and download a huge amount of content. If you serve media content in alt.binaries.*, imagine a few users on high-speed connections trying to download a movie at the same time. They can either get their content quickly (which requires a large amount of bandwidth) or it can take forever, resulting in low quality of service (rough numbers below).
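For a rough sense of the numbers on that last point, here's a quick sketch (the user count, file size, and uplink speed are assumptions picked for illustration, not actual figures):

# Rough quality-of-service arithmetic for concurrent binary downloads.
# All numbers below are illustrative assumptions.
concurrent_users = 5      # a handful of users downloading at once
file_size_gb = 8          # assumed size of one binary post set, in GB
uplink_gbps = 1           # assumed server uplink, in Gbit/s

total_bits = concurrent_users * file_size_gb * 8 * 1e9
seconds = total_bits / (uplink_gbps * 1e9)
print(f"Best case: about {seconds / 60:.0f} minutes to serve everyone")
# Five 8 GB downloads sharing a 1 Gbit/s uplink take about 5 minutes even in
# the best case, and they saturate the link for every other user meanwhile.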

There's actually quite a bit of info at https://en.wikipedia.org/wiki/Usenet

Hope that helps.
Drew Phillips
Programmer / System Operations, Sonic.net
by ewhac » Fri Aug 05, 2016 11:18 am
drew.phillips wrote:Hope that helps.
It does; thank you. Clearly my sense of Usenet traffic volume is badly out of date.
