If I want to use one of the hyper* tools to partially synchronize files p2p on a LAN, which one should I use?
The hyper* world seems to be very fragmented right now. There is the dat [0] project, which started in 2013 and shares files between computers p2p. In May 2020 the dat protocol was renamed to the hypercore protocol [1], and dat "will continue as the collective of teams and projects that have evolved from the original Dat CLI project". hypercore-protocol [2] links to multiple applications for file sharing (none of them the dat CLI tool).
Hyperdrive [3] "help[s] you share files quickly and safely, directly from your computer [...] -- all P2P". The GitHub repo [4] mentions the hyperdrive daemon as a batteries-included experience, but the hyperdrive daemon has a deprecation notice and tells you to use hyperspace.
Hyperspace [5] "provides remote access to Hypercores and a Hyperswarm instance" and "exposes a simple RPC interface". The documentation is very technical and seems to be aimed at developers rather than at people who want to use it as a tool.
Digging around in the GitHub organisation [6], or by stumbling upon the Patreon [7], one can find the hyp CLI tool, which is "A CLI for peer-to-peer file sharing (and more) using the Hypercore Protocol". The first commit is 9 days old. On the author's Twitter one can also find hyperbeam [8], which is integrated into hyp.
Here on HN one can also find an announcement for "uplink" [9].
The tech looks pretty cool, but the sheer number of different projects makes it difficult to grasp. Of all these tools, the dat CLI seems to be the most mature, but it is not actively maintained. It's not linked on any of the hyper* sites and doesn't seem to be the recommended way to use the hypercore protocol to share files p2p. While I would like to use the tech, I'm pretty lost on how to do that today.
First off, as someone who has spent many hours trying to use dat, I agree 100% with your assessment.
I have been using the dat command line tool to p2p synchronize files on a LAN (and also over the internet) for a year or two now. The latest version is more stable for me than the previous version and I really love it. It synchronizes files across multiple computers nearly instantly.
I'm glad the dat CLI is still working for you. It's on an older protocol codebase and won't be maintained; the hyp CLI is meant to replace it going forward.
If someone would decide to use either dat or hyper as a base for a project today: what are the differences?
Is hyper going to replace dat as an advancing network protocol? Is peer discovery different, or did hyperdrive change...or...?
(I am very confused about dat/hyper efforts, as I've been following it a bit over the last year)
I've seen that beaker changed to hypercore/hyperdrive...so my current assumption is that the hyper protocol is the web browser effort, rather than the other parts of infrastructure.
Are there incompatibilities in both protocols, or is hypercore trying to refactor its codebase in order to say, remove legacy dependencies that might not be necessary anymore?
Dat is essentially the old version of the Hypercore Protocol (aka "Hyper"). We had to make a set of breaking changes and decided to rebrand. So, if you're looking for a base, use the hyper stuff.
The changes included a switch to a DHT for peer lookup and data-structure reworks that improved speed and scaling by quite a bit.
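To make the DHT change concrete, here is a minimal, hypothetical sketch of Kademlia-style lookup, the general scheme DHTs like Hyperswarm's are built on: peers and topic keys both get IDs, and "closeness" is the XOR of two IDs. None of the names below come from the actual hyperswarm API; this only illustrates the lookup idea.

```javascript
// Hypothetical sketch: XOR-metric peer lookup as used by Kademlia-style DHTs.
// IDs are fixed-length Buffers; distance between two IDs is their bytewise XOR.
function xorDistance(a, b) {
  const out = Buffer.alloc(a.length);
  for (let i = 0; i < a.length; i++) out[i] = a[i] ^ b[i];
  return out;
}

// Pick the known peer whose ID is XOR-closest to the topic key.
// A real DHT iterates this, asking closer peers for even closer peers.
function closestPeer(topicId, peers) {
  return peers.reduce((best, p) =>
    Buffer.compare(xorDistance(p.id, topicId), xorDistance(best.id, topicId)) < 0
      ? p
      : best);
}

// Toy example with 1-byte IDs:
const peers = [
  { name: 'A', id: Buffer.from([0x10]) },
  { name: 'B', id: Buffer.from([0xf0]) },
];
console.log(closestPeer(Buffer.from([0x12]), peers).name); // 'A' (0x10 ^ 0x12 = 2)
```

The practical upside over the old tracker-style discovery is that no single node has to know every peer; each node only keeps a small routing table of IDs near its own.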
Does the new CLI allow multi-writer yet? I'd like multiple computers to be able to modify the dataset.
Similarly, it seems the protocol supports keeping around all versions of the hyperdrive, but it looks like that isn't made available via the CLI. Is this correct?
That sounds cool. Would you mind telling us a bit more about your setup? How does it compare to Syncthing? I'm still looking for a p2p file sync solution that can mix locally and remotely stored files (and actually works at all).
I don't know Syncthing so can't comment on that. My setup is very simple: `dat share` running on my laptop and `dat <hash>` on all of my servers. The servers are running inside docker/k8s in a daemonset, so every server in the cluster has a recent version of the files. Some benefits of this over, say, a shared NFS mount hosted in the cluster:
- I can use my local editor online or offline in the same config
- my editor doesn't hang as it saves over a network connection
- live-reload apps running on the servers will pick up the changes immediately; inotify doesn't work on NFS-mounted drives.
One main disadvantage in some cases is that it doesn't currently support multi-writer: the servers mounting the dat are read-only. I'm pretty sure this is changing in newer versions, but I'm not sure of the roadmap.
The tech is solid, there is no doubt. However, despite following Dat for about six years, I've found the general organization has always been lacking. I took a break from following it and all "tech" for all of 2020 as I needed a breather and focused on other offline interests. The new Beaker Browser v1 was finally released, which was a big milestone, and this news hit my radar. It inspired me to delve back into this particular technology and go down the hyper rabbit hole. I read about the change from Dat to Hyper and spent the last week or so trying to grok all the reorg and docs. It's dizzying. I'm both very interested in using this tech again and sadly also frustrated with the org issues, which seem not so much better than before, just different.
On a high level, there are improvements from rebrand to abstraction and generalization of components to allow for a better path forward. I can def see a developer who never followed Dat having a lot of problems grasping Hyper Protocol without investing a lot of upfront time... since it’s been difficult for me who has used Dat pretty extensively and followed all things Dat prior to 2020.
I think they are going about this the right way, though, and it will just take time for it all to get calibrated and be more cohesive. I’m sure they will take all the help they can get so hopefully more people will get involved because the tech has def improved a lot and IMO is a big deal in p2p.
I've never heard of any of these tools, but they all individually sound very useful and cool. This is one of those projects that would probably benefit from a technical writer and marketing person.
Only one person using one device is allowed to append to the log. Each entry contains the hash of the entry before it (which in turn contains the hash of the entry before that, and so on back to the start). If you publish entries with a conflicting ordering, you're assumed to be a bad actor and get blocked by the network.
If you want multiple devices or multiple people editing the same data, you need to give each of them a separate log and use some kind of CRDT system to combine them.
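The per-device-log approach can be illustrated with the simplest CRDT there is, a grow-only set: each device appends only to its own log, and any reader merges all logs by union, which gives every replica the same result regardless of merge order. This is a hypothetical sketch of the general pattern, not any specific multi-writer library.

```javascript
// Each device writes only to its own append-only log; readers merge all logs.
// A grow-only set (union of everything ever written) is order-independent,
// so replicas converge no matter which log they see first.
function mergeLogs(...logs) {
  const merged = new Set();
  for (const log of logs) {
    for (const entry of log) merged.add(entry);
  }
  return merged;
}

const laptopLog = ['add notes.txt', 'add todo.md'];
const phoneLog = ['add photo.jpg', 'add notes.txt']; // overlaps are fine

const a = mergeLogs(laptopLog, phoneLog);
const b = mergeLogs(phoneLog, laptopLog);
// Same merged state either way:
console.log([...a].sort().join() === [...b].sort().join()); // true
```

Real multi-writer systems need richer CRDTs (e.g. ones that can express deletes and renames), but the shape is the same: single-writer logs underneath, deterministic merge on top.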
It's pretty nice to see that there are now several P2P techs.
This one seems pretty good, but unless there is a single client and the protocol is defined to do one thing and do it well, I can't see it thriving.
For example, BitTorrent is a good tech because it only does one thing and users can understand it. Same for IPFS. Software like LimeWire or Kazaa thrived because it was simple enough to use. Same goes for protocols.
In my view, decentralization requires:
* a client that runs on the user's hardware
* a p2p database and filesystem, with this kind of append-only log and verification, that runs on the client
* servers can still accelerate access times
Generally with this kind of system, the goal is that each user can publish data/content in either a public or private manner, without being dependent on servers.
The only problem is being able to attract users and compete with popular platforms. If such a protocol+software can attract users who want a "privacy-aware" alternative to platforms like facebook, I could see it work.
Sometimes I wonder why there isn't more money invested to build such platforms, because the p2p internet seems to be the only viable solution to fight privacy issues.
Yes and no. The tech is like the "Internet": largely born from academic work, and it continues forward as fundamental, foundational p2p networking advancement. It's not a consumer product itself, but builders have filled, and will keep filling, that void. Beaker is the best example, more will eventually follow, and some projects you may have heard of are actually using some aspects of hypercore.
All of the decentralization/P2P companies I have seen focus on platform before product, seemingly with the attitude of “if we build it, developers will come”. Has this worked out for any of them so far?
Regardless, the companies do great R&D, so I’m glad that people are investing in them.
Are there any companies building product first that also happen to be P2P in nature?
We're pretty product-focused. We started with Beaker Browser, then recently have been working on CLI tools for private file sharing [1], and now we're working toward an app with the same goal [2].
How is this patreon thing going for you? dat is a pretty neat protocol, but it seems to have trouble coming out of its niche. Are there any products (as in $$$) coming out of the community?
Or maybe some bigger sponsors (think decentralized web)?
We started it this month and are at $150/mo, not a bad start and we appreciate the help. We're trying it as a way to bridge us to the next step.
I love Beaker but I think it was strategically too long-term. The hyp CLI and uplink are us turning our focus to better immediate utility for people, which we think is the right near-term strategy for breaking out of the niche.
I don't think this is doable yet. To effectively do decentralized/p2p web in a mainstream browser, you'd need (some of) the features mentioned here - https://github.com/mozilla/libdweb
How so? As I understand it, WebRTC gives you the ability to do a p2p web, you just need a bootstrap server to load scripts and do STUN. WebTorrent has DHTs working in the browser, not sure how much more decentralized you're talking about.
> WebTorrent has DHTs working in the browser, not sure how much more decentralized you're talking about.
Not sure about how things work now, but IIRC WebTorrent web peers could not connect to native BitTorrent peers. Instead they would connect only to WebTorrent Desktop peers and other web peers. The WebTorrent team of course is doing incredible work, and they've been able to convince some of the native torrent clients to support WebTorrent, so it kinda works in practice.
So for a new protocol, the hard part would be to make all the non-browser peers speak WebRTC - which is nowhere as easy/robust as writing a TCP/UDP based peer [1].
dat/hypercore, from what I've seen, has been a lot better at that than others, e.g. with the initial work on a scientific-dataset use case, beaker browser, ...
I've recently installed beaker browser to have a look and poke around hypercore.
Unfortunately, so far I couldn't load a single page from https://explore.beakerbrowser.com/ - even though those pages are supposed to be kept online thanks to hashbase.io
Such a pity, I'd really like to see something like this take off!
It would be cool to run one of those P2P technologies over a protocol like PJON: https://github.com/gioblu/PJON. As long as the network infrastructure is centralized, P2P is of relatively limited use; it would be cool to see it applied on top of a private network infrastructure built by people.
A related thing I'd love to see take off is companies offering log-based interactions instead of HTTP APIs only. Some apps are such a poor fit for HTTP that you end up with a convoluted mess of webhooks as soon as any elements of async appear. Obviously, these webhook contraptions are always home-grown and offer nothing near what you get with a proper log system.
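The core advantage of a log over webhooks can be sketched briefly: the consumer keeps a cursor into an append-only log and pulls from it, so missed events are simply still there after downtime, instead of the provider having to detect failed webhook deliveries and retry. This is a hypothetical in-memory sketch; a real system would persist the log and the cursor.

```javascript
// Toy append-only event log offered by a provider, in place of webhooks.
const eventLog = []; // provider only ever appends here
function publish(event) { eventLog.push(event); }

// Consumer pulls everything past its saved cursor and gets a new cursor back.
function consumeFrom(offset) {
  return { events: eventLog.slice(offset), nextOffset: eventLog.length };
}

publish({ type: 'invoice.created', id: 1 });
publish({ type: 'invoice.paid', id: 1 });

let cursor = 0;
let batch = consumeFrom(cursor); // consumer online: sees both events
cursor = batch.nextOffset;

publish({ type: 'invoice.created', id: 2 }); // happens while consumer is down

// Consumer comes back and resumes exactly where it left off; nothing is lost
// and nothing is delivered twice:
batch = consumeFrom(cursor);
console.log(batch.events.length); // 1
```

With webhooks, the provider owns retry logic, ordering, and deduplication; with a log, the consumer owns one integer.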
Hmm, somehow our networking tech has this big gap between LANs and Clouds ...
I think there is more between LANs and Clouds, e.g. encrypted emails, or VPN tunnels which could be set up between sender and recipient, and last but not least HTTPS or SFTP access to documents (which could still be encrypted Zip files). You'd have to share keys/secrets, but you'll need that for secure P2P connections anyway.
Sure, but I think it can be easier to do. VPNs require some setup, so it's not as ad hoc as you want for sharing files between two offices. The HTTPS server is what many offices do now when they deal with a lot of secure document exchange with clients (law firms, accountants, etc.). Again, not bad, but it requires them to set up an on-prem public server to have security equivalent to peer-to-peer. The opportunity is to take some steps out of the process.
The Dat people are notoriously opposed to blockchain hype. That doesn't mean you should actually go build anything on top of their projects (for reasons to do with what the top post in this thread touches on: they don't know much about stability or focus).
Not warranted in my experience. I've never heard anyone in the community (except perhaps...er..me) talk about integration between Dat and a blockchain.
The example you gave, Megaupload, wasn't a peer-to-peer blockchain-buzzword-bingo site; it was just a plain old Web 2.0-era website with a founder who turned a blind eye to (or perhaps encouraged) pirated media being uploaded. Google Kim Dotcom if you're interested in his story.
He hasn't done any jail time, despite the US DOJ's attempts to extradite him, although that really leads to the question as to whether jail is an appropriate punishment for copyright infringement.