

Yep. I also have valid concerns. But those seem to be their next steps. I just wondered if there would be degradation. You wouldn’t even be able to tell until it reached the destination.
Definitely interesting stuff.
I’m the administrator of kbin.life, a general purpose/tech orientated kbin instance.


This is for communication, not computation or even cryptography. The point in transferring it this way is so as to maintain the unseen property of the photon.


I think my question on all this would be whether this would ultimately cause problems in terms of data integrity.
Currently, most amplifiers for digital information capture the information in the light, strip off the modulation to get to the raw data, and then re-modulate that using a new emitter.
The advantage of doing this over just amplifying the original light signal is the same reason switches/routers are store-and-forward (or at least decode to binary and re-modulate). When you decode the data from the modulated signal and then reproduce it, you remove any noise that was present and reproduce a clean signal again.
If you just amplify light (or electrical) signals “as-is”, then you generally add noise every time you do so, reducing the SNR a small amount each time. After enough repetitions the signal becomes unrecoverable.
So I guess my question is: does this process also have the same ultimate limit on how often you can re-transmit the signal without degradation?
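To make the amplify-vs-regenerate difference concrete, here’s a toy simulation (my own sketch, not from the article; the noise model and numbers are made up for illustration). An “analog” chain adds fresh noise at every hop so noise power accumulates, while a “regenerating” chain decodes to bits at each hop and re-emits clean levels, resetting the noise each time.

```python
import random

def send(bits, hops, regenerate, noise_std=0.2, seed=42):
    """Push a bipolar (+1/-1) signal through `hops` noisy stages.

    regenerate=False: plain amplification, noise accumulates hop by hop.
    regenerate=True:  store-and-forward, decode and re-emit at every hop.
    Returns the number of bit errors at the receiver.
    """
    rng = random.Random(seed)
    signal = [1.0 if b else -1.0 for b in bits]
    for _ in range(hops):
        # every hop adds its own fresh noise
        signal = [s + rng.gauss(0, noise_std) for s in signal]
        if regenerate:
            # decode to bits and re-emit clean levels, discarding the noise
            signal = [1.0 if s > 0 else -1.0 for s in signal]
    decoded = [s > 0 for s in signal]
    return sum(d != b for d, b in zip(decoded, bits))

bits = [random.Random(1).random() > 0.5 for _ in range(1000)]
analog_errors = send(bits, hops=50, regenerate=False)
regen_errors = send(bits, hops=50, regenerate=True)
print(analog_errors, regen_errors)
```

With these made-up parameters the analog chain’s noise standard deviation grows like the square root of the hop count and eventually swamps the ±1 signal, while the regenerating chain stays essentially error-free — which is exactly why the “can’t regenerate without reading it” constraint matters for the quantum case.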


Pretty sure this was made clear in the article but… I’ll outline the little I know on the subject as a complete layman.
Currently we have been able to use quantum effects to create single runs of fibre that cannot be intercepted undetected. That is, if the data is intercepted by any known means, the receiver will be able to detect this.
The shortcoming of this method is, of course, that when you need to amplify the signal, that’s generally a “store and forward” operation and thus would also trip this system’s detection. You could, I guess, perform the same operation wherever the signal is amplified, but that adds another point at which monitoring could happen. If you want one trusted sender, one trusted receiver and nothing in between, this is a problem.
What this article is saying is that they have found a way to amplify the information without ever “reading” it, therefore keeping the data showing as “unseen” (for want of a better word). As such, this will allow “secure” (I guess?) fibre runs of greater distances in the future.
Now, the article does go into some detail about how and why this works. But for the basic question of why this is a good and useful thing, that is pretty much all you need to know.


This is the world most of us want to live in I would think.
He is not the one to deliver it. He doesn’t really want that. If he did, he wouldn’t want $1t, let alone fight to get it. Let alone all the other vile shit he has done, and will do.
Of course not. As the Merovingian in The Matrix says, French is a fantastic language, especially to curse with.


So far I’ve mostly avoided the whole “things that don’t need to be on the Internet” situation.
Non-smart TV (well, from that period when they started adding smart features, but they’re all out of date now so they’re not even connected to the Internet).
All kitchen stuff is just kitchen stuff. No Internet.
Car is still offline.
Only real exception is smart thermostat, and that’s just because when the boiler was installed that’s what they put in.


It’s as true today as it was then?


Sure. But it’s to Mars, with Musk.
Then I suggest they use an XNOR pointer instead! Checkmate patent trolls!


Or, you do the tutorial, play for an hour, don’t come back for a year, and don’t know what is going on.
Huh. I am sure you could search for individual books. You could do it by Goodreads ID, I think? Yes, adding an entire author as the primary way to do things is a bit much for some. I know for sure I have managed to add individual books before now.
Yep, same. Well, I actually remember first finding the best ways to copy a game on tape error-free. Some, without protection, you could just save back to tape for a digital reproduction (and this also allowed tape-to-disk conversion). Those with non-destructive copy protection could kinda be copied too, if you knew a little Z80 ASM. Others, you needed to copy tape to tape and hope the quality turned out OK.
But yes, then bringing your box of copied disks (Amiga in my case) into school and swapping with your friends was the way to go.
That’s fine. I’ll make my own internet. With blackjack, and hookers. In fact, forget the internet!
It’s a real shame because Readarr did work and they really just needed to fix their own metadata servers. No? Or were there other problems I’m not aware of?


I mean, I have to say I’ve hastened my own demise (in program terms) by over-engineering something that should be simple. Sometimes adding protective guardrails actually causes errors when something changes.


Yes, had the same happen. Something that should be simple failing for stupid reasons.


Yep. It seems they haven’t changed a thing about the format. Probably a script much older than mine on their end is generating it too.


I have a tool that I wrote, probably 5+ years ago. It runs once a week, collects data from a public API, and translates it into files usable by the Asterisk phone server.
I totally forgot about it. Checked. Yep, up to date files created, all seem in the right format.
Sometimes things just keep working.
I would say that now it’s learning that sticking your head in the sand is only ever a delaying tactic. But if it DID learn that, it’d mean it had surpassed us already.