If models are trained on data that it would be a security breach for them to reveal to their users, then the real breach occurred at training.
Rogers. Fred McFeely Rogers. If you’re over the age of 30 that’s the answer, and if not then it’s still a good answer.
Never mind the small fry. The word “put” has enough different meanings to fry your CPU.
In the 24th century, I mean. It wouldn’t have been 1980s USSR-style Communist at any rate.
I don’t know if Russia or indeed all of Earth was communist when Worf grew up, but the whole Klingon Empire was widely recognized by people of the 20th century as resembling the Soviet Union, which was nominally communist during the Cold War era of original Trek. Memory Alpha even mentions it. Mostly just because they were “the bad guys” I suppose.
For a minute there I forgot Communist Himbo was even in DS9. I should probably re-watch it after I'm done with Voyager; it's been a good 20 years.
Regardless of how much or how little support one wants to give Ukraine (Slava Ukraini!) it always struck me as bizarre that people thought it reasonable to give them missiles and then expect them not to use those missiles against the country they’re at war with.
I am searching with /x
On most systems these days you can use regular expressions there. If /-x isn't good enough, try /-x[ ,] or whatever.
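The same trick works outside the pager too. A minimal sketch with GNU grep (the sample lines are made up for illustration): the character class makes `-x` match only when it's followed by a space or comma, so you skip every word that merely contains an "x".

```shell
# '--' ends option parsing so the pattern starting with '-' isn't read as a flag.
printf 'use -x to enable\nprefix matters\n' | grep -E -- '-x[ ,]'
```

The first line matches ("-x " is followed by a space); "prefix" does not, despite containing an x.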
It’s the easy way to spend 2% of GDP on the military: Lower the GDP.
Windows was but a brief interlude between AmigaOS and Linux.
I wonder what kind of training it takes to become capable of deciding that the thing your ammunition vending machine project really needs is Artificial Intelligence. Does the NRA have an MBA program?
I wonder how long this existed before it got discovered.
Support for it already seems to be there in wine, so rather than wait for 6.11 I think I’ll just go ahead and apply the patches myself to 6.10-rc7 and see if it makes any difference to the one game I regularly play. If my computer blows up as a result I’ll let y’all know.
(Result: None. The versions of wine I have probably need patching, or at least configuring, to use it. In the course of briefly considering how to work that out, I discovered that the expected improvements are not nearly as dramatic as suggested, at least compared to what proton already does most of the time (fsync). The main benefit for most of us will be better compatibility, not huge performance gains. Well, at least my kernel is ready for it.)
More recently, from Phoronix:
While the initial driver patches were merged to char/misc and now in turn within Linux 6.10 Git, much of the enablement work wasn’t accepted in time. Thus for Linux 6.10 the new NTSYNC driver is marked as “broken”, so it won’t even be built for normal kernel builds.
Hopefully for Linux 6.11 or sometime soon the rest of the NTSYNC patches are upstreamed for yielding this massive boost to Windows games on Linux.
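If you want to check whether your own kernel is ready, here's a quick probe, a sketch that assumes the usual distro layout where the build config lives at /boot/config-$(uname -r) and the driver exposes /dev/ntsync:

```shell
# Look for the NTSYNC option in the running kernel's build config,
# falling back to a message if the config file or option is absent.
grep -i ntsync "/boot/config-$(uname -r)" 2>/dev/null \
    || echo "no NTSYNC option in kernel config"

# The device node only appears if the driver is built and loaded.
ls -l /dev/ntsync 2>/dev/null || echo "/dev/ntsync not present"
```

On a 6.10 kernel you'd expect to see the option marked broken or missing entirely, per the Phoronix note above.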
PET-enabled home routing
Oh, apparently it’s a “5G” thing. Perhaps everyone in Europe knows that already. Apparently the design of the new network is complicated enough that they’ve accidentally left room for just a little bit of user privacy. Europol claims to have become dependent on the situation where people using mobile phones have none at all.
spotted the new ad format during their commute
Are people really using google maps during their regular commute now?
N: Unable to locate package mit
E: No packages found
Whatever it is, it doesn’t seem to be too popular. Not even in debian.
Maybe Gygax said genuinely sexist things about “women’s lib” when he got defensive about the topic, but if the only example you can find of that being reflected in D&D is that one time they added a male dragon who was good and a female dragon who was evil it seems difficult to justify writing that many words about it.
Polls were not good before the debate. Maybe it’s too early to say what the effect will be but here’s the second report I’ve seen saying it’s getting worse.
It’s a reasonable question, and the answer is perhaps beyond my ken even though I’ve had substantial experience with both building machine learning models (mostly in pre-LLM times) and keeping computer systems secure. That a chatbot might tell someone “how to make a bomb” is probably not a great example of the dangers they pose. Bomb making instructions are more or less available to everyone who can find chemistry textbooks. The greater dangers that the LLM owners are trying to guard against might instead be more like having one advising someone that they should make a bomb. That sort of thing could be hazardous to the financial security of the vendor as well as the health of its users.
Finding an input that makes the machine produce gibberish is not directly equivalent to the kind of misbehaviour that often indicates exploitable bugs in software that "crashes" in more conventional ways. But it may be loosely analogous, in that it's an observation of unintended behaviour which might reveal flaws that would otherwise remain hidden, giving attackers something to work with.