Recent comments

onion OP wrote

Eric Washabaugh served as a targeting and technology manager at the CIA from 2006 to 2019, leading multiple inter-agency and multi-disciplinary targeting teams focused on al-Qa’ida, ISIS, and al-Shabaab at the CIA’s Counterterrorism Center (CTC). He is currently the Vice President of Mission Success at Anno.Ai, where he oversees multiple machine learning-focused development efforts across the government space.

PERSPECTIVE — As the U.S. competes with Beijing and addresses a host of national security needs, U.S. defense will require more speed, not less, against more data than ever before. The current system cannot support the future. Without robots, we’re going to fail.

News articles in recent years detailing the rise of China’s technology sector have highlighted the country’s increased focus on advanced computing, artificial intelligence, and communication technologies. The country’s five-year plans have increasingly focused on meeting and exceeding Western standards while building reliable internal supply chains and research and development for artificial intelligence (AI). Key drivers of this advancement are Beijing’s defense and intelligence goals.

Beijing’s deployment of surveillance in its cities, online spaces, and financial systems has been well documented. There should be little doubt that many of these implementations are being mined for direct or analogous uses in the intelligence and defense spaces. Beijing has been vacuuming up domestic data, mining the commercial deployment of its technology abroad, and collecting vast amounts of information on Americans, especially those in the national security space.

The goal behind this collection? The development, training, and retraining of machine learning models to enhance Beijing’s intelligence collection efforts, disrupt U.S. collection, and identify weak points in U.S. defenses. Recent reports clearly reflect the scale and focus of this effort – the physical relocation of national security personnel and resources to Chinese datacenters to mine massive collections and disrupt U.S. intelligence collection. Far and away, the Chinese exceed all other U.S. adversaries in this effort.

As the new administration begins to shape its policies and goals, we’re seeing the typical media focus on political appointees, priority lists, and overall philosophical approaches. But what we need is an intense focus on the intersection of data collection and artificial intelligence if the U.S. is to remain competitive and counter this rising threat.

2

burnerben wrote

"we have to turn to violence because they wont listen" you guys are the fucktards that elected another dementia-ridden pedo that won't do shit for our country. the only good thing biden will do is his climate shit, otherwise he'll just be another shitty president.

1

Kalchaya wrote

This is not the first time Cloudflare has done this stuff. I forget who was targeted previously (8chan? voat? Someone else?). The point being, who but a 'tard would trust them not to do the same crap again? If someone sucker punches you, are you really that stupid to let him move behind you ever again? Apparently yes.

2

Elbmar wrote (edited )

I was surprised by this:

“Law enforcement… use these tools to investigate cases involving graffiti, shoplifting, marijuana possession, prostitution, vandalism, car crashes, parole violations, petty theft, public intoxication, and the full gamut of drug-related offenses,” Upturn reports.

This other article it linked to about consent searches was interesting too.

Imagine this scenario: You’re driving home. Police pull you over, allegedly for a traffic violation. After you provide your license and registration, the officer catches you off guard by asking: “Since you’ve got nothing to hide, you don’t mind unlocking your phone for me, do you?” Of course, you don’t want the officer to copy or rummage through all the private information on your phone. But they’ve got a badge and a gun, and you just want to go home. If you’re like most people, you grudgingly comply.

Police use this ploy, thousands of times every year, to evade the Fourth Amendment’s requirement that police obtain a warrant, based on a judge’s independent finding of probable cause of crime, before searching someone’s phone.

https://www.eff.org/deeplinks/2021/01/so-called-consent-searches-harm-our-digital-rights

2

Elbmar OP wrote (edited )

This is old news but it was news to me. I remember the 8chan users migrating to Zeronet in 2019 but I was not aware that so many had their IP addresses exposed.

Peer-to-peer networks expose a user’s internet address to anyone who cares to look. That’s how copyright lawyers catch people trading movies, music and software, and it’s how police and FBI agents arrest pedophiles trading child porn online.

ZeroNet works the same way, a fact that’s been much-discussed on the new site. For that reason, ZeroNet integrates tightly with Tor, an anonymity system that places layers of cut-out addresses between a user and the websites they visit. But only 41 percent of 08chan’s users are using Tor, based on our analysis of the peer-to-peer traffic at the site.

Users on 08chan have been complaining that the site is buggy and slow over Tor, and the site’s own administrator initially encouraged anons to just connect directly.
...
The Daily Beast captured 819 IP addresses for 08chan users connecting from 62 different countries.

Also, the users were concerned about child porn

“Say someones a f****t and uploads cp [child porn],” one Zeronet user wondered on Wednesday, as 8chan users flooded Zeronet’s discussion board. “If i happen to download it, it gets shared from my computer, right? And if i dont notice it bunch of people can download it from me? So im a distrubutor at that point, arent i?”

https://www.thedailybeast.com/8chan-refugees-worried-theyre-downloading-child-porn-on-peer-to-peer-site-zeronet

I think the main takeaway from this is that an ideal network for users of 8chan, Parler, etc. to migrate to would be

  • Anonymous by default, and fast regardless.
  • Text only, so that no one worries about accidentally downloading and distributing child porn.

I'm aware that even a regular web browser downloads child porn when a user views a page with it. But anyone who comes across child porn is required by law to report it. That puts people in a weird legal position if they come across it by accident while using a network like zeronet. Legally and morally, the correct thing to do is to remove it from the zites you are distributing and then report it but reporting might bring more attention to yourself even if you use the anonymous report feature, especially since you were briefly a distributor.

I think using a peer-to-peer network is one thing they got right, because there's no individual who can be threatened, coerced, or persuaded into shutting 08chan down.

1

Elbmar wrote

People can delete their messages but I haven't seen it happen enough that it really bothers me.

Yeah it's preferable for news stories to remain up forever. Maybe IPFS could eventually become popular enough that news organizations use it as well. But in the meantime archivists can use it to archive news stories permanently. I agree that it's important for news articles, scientific articles, statements from politicians etc. to not be memoryholed. But ideally, right wing groups should use private anonymous networks with auto-disappearing messages because it's safer. Members being targeted by law enforcement has a much worse effect on a group than any negatives that might come from people deleting their own messages.

1

Wahaha wrote

If you're participating in a discussion and then memory hole your contributions, nobody can read up on the discussion, since part of it is missing. You could also write up a news story and then memory hole it yourself, if you feel like it.

The ability to remove something you published can be used maliciously. Thus, one of the points of decentralization is to prevent anyone from even having that ability.

1

Elbmar wrote

Not sure what malicious use would be. I haven't ever seen the type of drama where someone says something, deletes it, and then denies ever saying it and gets into arguments with people about it.

Ultimately, advantages are subjective for different people. You value posts existing forever but many people prefer the opposite. Signal is popular partially because of the disappearing messages feature. I think especially on the right, people will increasingly value privacy over convenience. I think we are probably heading into a very totalitarian, technocratic future where it will be more and more dangerous to have right wing views.

Personally, if I see a very interesting post online, I sometimes just save it in a document on my computer. If scuttlebutt implements the delete message feature, it would be nice for them to also have a save message feature that saves the message but not the username. Or allow users to just remove their identity from messages that they don't want associated with themselves any more. Similar to how reddit shows [deleted] for the username after someone deletes an account.

Patchwork and apps like it could agree to not show deleted messages in their user interface. That way, if someone was making backups, it would be harder to read deleted messages. It would still be possible, but the person doing it would need to know how to decrypt them. Don't know if that would be a desired feature by the community or not, but it would be a way to get the delete feature as complete as possible.

1

Wahaha wrote

I can see why people would want that feature, but it wouldn't change the fact that somebody would have the ability to memory hole something, which isn't desirable, since it can be used maliciously and thus can harm trust.

If I can't trust everything to remain there forever, there's no big advantage over centralized solutions.

Luckily, by design, all the content I see ends up saved on my computer, so with a differential backup, it should be trivial to go back in time and read memory holed posts.

1

Wahaha wrote

GDPR only applies to personal data. Whatever you posted is still fair game, especially if it was under a pseudonym in the first place. It's different from the "right to be forgotten".

Also, on a technological level this process isn't automated. Someone has to go in there, verify it's your data, and delete it manually from the database. It could be automated in the future, but it wasn't in the past, and without rebuilding everything from scratch, it won't be in the future either.

Also, I'm an IT guy from Europe who is very fortunate that no one ever asked for shit to be deleted. But on the bright side, even if somebody did, there's still no way for them to verify that we actually deleted everything. So reasonably, all we have to do is stop exposing their information and nobody would be any the wiser.

1

Elbmar OP wrote

Btw, I just found the part of the docs that explains how their cancel-culture-type views have influenced the protocol. You can publicly block someone, and that is announced to your peers. So for example, if a popular leftist scuttlebutt user publicly blocks someone, saying it is because "he is a racist/sexist/homophobe/whatever", there would probably be peer pressure for others to publicly block the same person.

If someone is bothering you or saying things you don’t want to hear, you can block or ignore them from their profile page (in the Options button). This will hide their messages and comments from you. You can loudly block someone, or quietly ignore them. A block is public and everyone can see it. Blocking is a way to demonstrate community norms and alert your friends to someone they may also want to block. Sometimes it starts useful conversations. Ignoring quietly is a secret action that only you will know about. It hides the person from your view.

https://scuttlebutt.nz/docs/introduction/detailed-start/#stay-happy-and-safe

It could be a pretty useful feature for the right as well, though. If some user was posting child porn and a peer publicly blocked them for that reason, I would appreciate getting a heads-up so I could block them as well. Same if leftists attempted a raid on right-wing "pubs" and users. They could be blocked.
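The SSB protocol guide describes follows and blocks as "contact" messages published to your own feed, which is how a public block gets announced to peers. Here's a rough sketch in Python of what that announcement looks like (the feed IDs are made-up placeholders, and this is a simplified model, not real Scuttlebutt code):

```python
# Sketch of a Scuttlebutt public-block announcement, loosely based on
# the "contact" message type in the SSB protocol guide. Feed IDs here
# are fake placeholders.

def make_block_message(author_id: str, target_id: str) -> dict:
    """Build the content of a public block announcement.

    Peers replicating author_id's feed see this message, so they get a
    heads-up and can choose to block (or stop replicating) target_id too.
    """
    return {
        "type": "contact",
        "contact": target_id,   # the feed being blocked
        "blocking": True,       # public: everyone can see it
    }

msg = make_block_message("@alice.ed25519", "@spammer.ed25519")
print(msg["blocking"])  # True
```

A quiet "ignore", by contrast, would just be a local setting that never gets published to your feed at all.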

2

Elbmar wrote (edited )

I think the main advantage of decentralized over centralized is that other people can't memory hole your posts. If you can memory hole your own posts, that is an advantage. If you ever get in trouble with the law, it's helpful to have no online history that they know about. Ideally, they will not know your username, but the right is too online now compared to the left. The right really should be using the internet to facilitate offline organizing more often, and that introduces the possibility of law enforcement knowing your online identity.

For example, if you are defending yourself from Antifa and get charged with assault, you may be happy if you deleted all your posts before meeting up with people, so nothing you said can be twisted and used against you (though they might say it's suspicious that you deleted all your posts; it's nice that in Matrix, changing your password encrypts all your old posts by default, which looks less suspicious). The NSA or FBI could certainly still have the posts you deleted and know that you made them, but local law enforcement is not so sophisticated.

I think you could have scuttlebutt or something like it, which stores all messages for you to read offline, but also have a feature where if you say that you want all of your posts deleted, then your computer could send that message out to all of your peers. They would forward that message to any of their peers who can also read your messages. (See the "Follow Graph" here https://ssbc.github.io/scuttlebutt-protocol-guide/#follow-graph ) The peers that are already online would respond immediately and delete your posts from their local store. Some of your peers and peers of peers with access to your posts could be offline so they would still retain your posts temporarily, but when they connect to the internet again, those peers would see that you want your posts deleted, either by checking with you or their peer who is connected to you, and they would immediately delete them as well.
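A rough sketch of that idea in Python (toy code, not real Scuttlebutt): the delete request acts like a "tombstone" that floods through connected peers, and peers that were offline catch up on tombstones when they reconnect.

```python
# Toy model of the delete-request idea: a tombstone for an author gossips
# through the peer graph, and every peer that sees it drops that author's
# posts from its local store. Offline peers catch up via sync().

class Peer:
    def __init__(self, name):
        self.name = name
        self.posts = {}          # author -> list of that author's posts
        self.tombstones = set()  # authors whose posts must be deleted
        self.neighbors = []      # directly connected peers

    def receive_tombstone(self, author):
        if author in self.tombstones:
            return                        # already seen; stop re-flooding
        self.tombstones.add(author)
        self.posts.pop(author, None)      # delete from local store
        for n in self.neighbors:
            n.receive_tombstone(author)   # forward to peers of peers

    def sync(self):
        """Run when a peer comes back online: pull tombstones from neighbors."""
        for n in self.neighbors:
            for author in list(n.tombstones):
                self.receive_tombstone(author)

# Demo: b and c both hold alice's posts; c is offline during the flood.
a, b, c = Peer("a"), Peer("b"), Peer("c")
a.neighbors, b.neighbors, c.neighbors = [b], [a], [b]
b.posts["alice"] = ["old post"]
c.posts["alice"] = ["old post"]

a.receive_tombstone("alice")   # alice requests deletion via peer a
print("alice" in b.posts)      # False: b deleted immediately
c.sync()                       # c reconnects later
print("alice" in c.posts)      # False: the deletion caught up
```

Of course, like with any p2p design, a hostile peer could just ignore the tombstone and keep its copy, so this is best-effort deletion, not a guarantee.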

In the scuttlebutt documentation I saw that in the future they do want to allow people to delete posts and it is just a feature they haven't implemented yet. They also want to hide IP addresses by default.

We want Scuttlebutt to be a safe cozy place but there are still some things we need to fix:

  • Blocked people can see your public messages.
  • Content from blocked people is still on your computer. (This is almost fixed!)
  • Patchwork has some bugs that let you see blocked people in certain situations when they should be hidden.
  • Scuttlebutt doesn’t provide IP address anonymity by itself, but you can use it with a VPN or Tor.
  • Messages can’t be deleted yet.

https://scuttlebutt.nz/docs/introduction/detailed-start/#stay-happy-and-safe

1

dontvisitmyintentions wrote

The way we learn to become adults is by learning to think. Stifling speech prevents individuals from engaging in dialog that may lead them to learn to think through problems. Stifling free speech leads to anger, an emotion that blocks rational thought and encourages petty little people to remain children.

Very paternalistic. The author allows for no righteous anger and no resentment. There is only the argument against misinformation, just like the mainstream liars make. It's false.

2

Wahaha wrote

You wouldn't have to do anything complicated like that. Just create regular differential backups of everything, then you can go back in time and see the posts again. One of the points of decentralized networks is that you can still read everything, even without Internet. So if you design it in a way that requires an internet connection to read posts, it's no longer decentralized.
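To make the backup idea concrete, here's a toy sketch in Python (my own illustration, not tied to any particular client): keep dated snapshots of your local message store, then diff an old snapshot against the current store to recover anything that was later memory holed.

```python
# Toy illustration of recovering memory-holed posts from a snapshot:
# anything present in an old backup but missing from the current store
# was deleted in between.

def recover_deleted(old_snapshot: dict, current_store: dict) -> dict:
    """Return messages present in the old snapshot but gone now."""
    return {msg_id: text
            for msg_id, text in old_snapshot.items()
            if msg_id not in current_store}

snap_january = {"m1": "first post", "m2": "hot take", "m3": "typo fixed"}
store_now = {"m1": "first post", "m3": "typo fixed"}

print(recover_deleted(snap_january, store_now))  # {'m2': 'hot take'}
```

A real differential backup tool would do the same thing at the file level, but the principle is identical: whoever keeps snapshots can always read what was deleted.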

Another point is that the reason people want to use decentralized solutions is so that nobody has the ability to memory hole anything. Not even typos. If that's not the case, then what's the advantage over centralized stuff?

1

Elbmar wrote

Matrix is federated, not p2p, but when using it I noticed that if I changed my password, the encryption key for my posts would change as well which would make all of my past posts unreadable to everyone including myself, but my new posts would be readable. Of course if my past password was weak, it would still be easy for someone to decrypt my past posts.

It was possible to delete and edit posts as well. And if you disabled an account, you were met with a warning saying that people would not be able to read your past posts, which may disrupt the flow of conversations. Also, creators of a room could set it up so that any new user had no ability to view the old posts in the room. You could change your display name at any time, but your unique id is the name you chose when signing up. Your unique id is visible to anyone who right clicks on your display name.

When it comes to p2p tech, so far everyone is saying what you are suggesting is impossible, but I am at least interested to know whether it would make sense to code something similar to this, or if something similar already exists:

All posts are encrypted. Nodes you connect to store your posts, but in encrypted form, and they store the encryption key for your posts. They store a generated unique ID, not your display name. So if someone wants to save your posts to use against you, they have to have some basic technical capability: they need to know your account's unique ID, not its display name, and use the stored key to decrypt the posts associated with that ID. (Most would just screenshot it in this case, which can be more easily faked, so there is more plausible deniability for you.)

You can change your encryption key at any time. If you change the encryption key for your posts, then the key will be changed for all nodes connected to you, making your past posts unreadable to yourself and connected nodes.

If any node disconnects from you, or you disconnect from it, your files automatically get deleted from their store and their files automatically get deleted from your store.

If someone really wanted to hold on to someone's posts to use against them later, they could of course make a copy of the store before they disconnect from the other node, but they would need some basic tech knowledge to decrypt what is in it. Unlike making an archive link of some centralized page which requires almost no tech knowledge. If the p2p network gets popular enough, someone might make a service to simplify this process for people (similar to archive.org). But privacy would at least be comparable to centralized services.

But I know jack shit about coding p2p protocols and applications.
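For what it's worth, the key-rotation part of the idea can be sketched in a few lines of Python. This uses a toy XOR keystream instead of real cryptography, purely to show the mechanism: peers hold ciphertext plus the current key, and once the key rotates, the stored ciphertext becomes unreadable.

```python
# Toy model (XOR "encryption", NOT real crypto) of the key-rotation idea:
# peers store your posts encrypted with your current key; rotating the key
# leaves the stored ciphertexts undecryptable.

import hashlib

def keystream(key: str, length: int) -> bytes:
    """Derive a deterministic keystream of the given length from a key."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(f"{key}:{counter}".encode()).digest()
        counter += 1
    return out[:length]

def xor_crypt(data: bytes, key: str) -> bytes:
    """XOR data against the keystream; applying it twice round-trips."""
    ks = keystream(key, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

post = b"meet at the usual place"
old_key = "key-v1"

stored = xor_crypt(post, old_key)          # what peers keep on disk
assert xor_crypt(stored, old_key) == post  # readable with current key

new_key = "key-v2"                         # rotation: old key discarded
garbled = xor_crypt(stored, new_key)       # old posts now unreadable
print(garbled != post)  # True
```

The catch, as noted above, is that any peer who copies the ciphertext *and* the old key before you rotate can still decrypt everything, so this raises the effort bar rather than providing a hard guarantee.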

2