Hivemind is Live!

in #hivemind · 6 years ago

[Image: Hivemind Live]

Hello Steemians, we are excited to announce that Steemit's APIs are now powered by Hivemind! By routing most of the social APIs to Hivemind, we are able to dramatically shrink the size of the full nodes we run. Because there are so many social applications sharing information on Steem, full nodes had become extremely large and expensive to run, especially because they are not optimized for use cases that are not consensus-critical.

Hivemind is a service that syncs a traditional database with the blockchain. An added benefit of Hivemind is that, since it's written in Python, developers can modify its behavior and APIs with ease, instead of relying on C++ developers to modify code that can interact with critical consensus logic. Implementing features like unreblogging becomes trivial with Hivemind, while simultaneously enabling node operators to dramatically shrink the size (and cost) of their servers.
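To give a rough sense of what "syncing a traditional database with the blockchain" means, here is a minimal sketch of the general idea (this is not Hivemind's actual code): poll blocks from a steemd node over JSON-RPC and index the social operations into an ordinary SQL database. It uses SQLite for brevity; Hivemind itself targets Postgres.

```python
# Minimal sketch of the sync idea (not Hivemind's actual code):
# poll recent blocks from a steemd node and index comment/post
# operations into a plain SQL database.
import json
import sqlite3
import requests

STEEMD_URL = "https://api.steemit.com"  # any public steemd endpoint

def call(method, params):
    """Send a single JSON-RPC request to the node and return the result."""
    payload = {"jsonrpc": "2.0", "method": method, "params": params, "id": 1}
    return requests.post(STEEMD_URL, data=json.dumps(payload)).json()["result"]

db = sqlite3.connect("hive_sketch.db")
db.execute("CREATE TABLE IF NOT EXISTS posts (author TEXT, permlink TEXT, block INTEGER)")

head = call("condenser_api.get_dynamic_global_properties", [])["head_block_number"]

# Index the last few blocks: record every comment/post operation we see.
for num in range(head - 5, head + 1):
    block = call("condenser_api.get_block", [num])
    for tx in block["transactions"]:
        for op_name, op in tx["operations"]:
            if op_name == "comment":
                db.execute("INSERT INTO posts VALUES (?, ?, ?)",
                           (op["author"], op["permlink"], num))
db.commit()
```

A real sync obviously follows the head of the chain continuously and handles many more operation types (votes, follows, reblogs), but the shape is the same: read blocks, write rows.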

Testing

In our tests we were able to route all api.steemit.com APIs to the new configuration (Hivemind for the social APIs, everything else to "slimmed down" steemd nodes), and it performed well. In addition, these nodes were running on-disk, as opposed to in RAM; a major win which was only made possible thanks to AppBase.

We are extremely excited to see Hivemind running in production, on-disk, and utilizing AppBase. After completing this task, we have decreased our steemd instance sizes from 488GB RAM to 61GB RAM instances. We are doing further analysis and may be able to decrease this usage even further. These are major improvements to Steem that will make it easier and more cost efficient for anyone to run a full node, and all without requiring a hardfork.

Bug Hunting

We have so far identified one minor stability issue and will be rolling out an update shortly. If you are experiencing any issues using Steemit.com please leave a note in the comments.

The Steemit Team


Awesome. Great job delivering this guys! I was very glad to see this route taken as opposed to just 'sun-setting' the most popular and widely used app in all of crypto :)

This is fantastic work! I wondered whether you'd really be able to get RAM requirements down below 64GB, but you've done it.
Next question: who wants to help me set up a full node? I have the hardware to meet the 2 x 64GB + 1 x 32GB requirements and have very fast internet (300/150 Mbps).

Posted using Partiko iOS

@apshamiton
please contact me, @surfyogi on telegram messenger

A full node with only 61GB of RAM is the dream! Congrats, team! Can you guys share more specs about your instances? How much disk, SSD or NVMe, which CPU, etc.?

We run two types of Steem nodes - one that handles account history, and one that handles everything else except for the tags and follows plugins (because Hivemind handles these plugins itself now). Both types of nodes run on instances in AWS that have 61GB RAM and a single physically attached NVMe drive. The shared memory files are stored only on the physically attached drive, no longer in RAM as was previously required. Hivemind itself uses a Postgres database in AWS RDS on instances with 32GB RAM. Hivemind's 'app servers' use much smaller instances with only 4GB RAM. All of this has been a phase in our plan to streamline infrastructure for cost effectiveness, and there may be further improvements yet to be made.

So, how many moving parts does a Hivemind API instance have?

  • app server
  • postgres
  • api node

right?

Yes, Hivemind requires a Postgres database. You also need a steemd node (or nodes) that contains all plugins except for tags/follows. We use jussi for routing, which is a custom reverse proxy and caching layer.
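To make the routing part concrete, the idea is just prefix-based dispatch of JSON-RPC methods to different upstreams. The sketch below is illustrative only - it is not jussi's actual configuration format, and the internal hostnames are made up - but it shows how Hivemind-served methods, account history, and everything else get split up.

```python
# Illustrative only: not jussi's real config, just the routing idea.
# Pick an upstream by method-name prefix and forward the JSON-RPC
# request unchanged. Hostnames below are hypothetical.
import json
import requests

UPSTREAMS = {
    "condenser_api.get_followers":       "http://hivemind:8080",
    "condenser_api.get_discussions_by_": "http://hivemind:8080",
    "condenser_api.get_account_history": "http://steemd-account-history:8090",
    "":                                  "http://steemd-fat-node:8090",  # default
}

def route(request_body: dict) -> dict:
    """Forward a JSON-RPC request to whichever upstream owns its method."""
    method = request_body.get("method", "")
    # Longest matching prefix wins; the empty prefix is the fallback.
    prefix = max((p for p in UPSTREAMS if method.startswith(p)), key=len)
    resp = requests.post(UPSTREAMS[prefix], data=json.dumps(request_body))
    return resp.json()

print(route({"jsonrpc": "2.0", "id": 1,
             "method": "condenser_api.get_followers",
             "params": ["steemitblog", "", "blog", 10]}))
```

On top of the routing, jussi also caches responses, so repeated identical queries can be served without hitting the backing nodes.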

Everything has official docker images available and could be set up to run together with docker-compose.

Well, I'm not a developer; I just want to figure out how much you save with the new set of nodes compared to the old RPC node.

With the cost of instances with lots of RAM being very high and fast disk being relatively cheap, quite a lot.

How much does 1GB of RAM cost now?

It's not really a full node with only 61GB of RAM - it's that each node is only 61GB (there are multiple (3) that make up a full node).

Two types of Steem nodes, not three. See below reply to @howo :)

Servers/nodes same thing :)

Just wanted to be clear it wasn't a single 61GB server making up a full node.

So it's kinda like sharding?

Kind of, but not really. In sharding you have the same tech but split into chunks.

With this you are splitting into chunks but using different tech (steemd, hivemind, rocksdb) in each section.

What is rocksdb? I saw mention of postgres above...

Full nodes used to run with all data stored in RAM. RocksDB is an on-disk database that was added to recent versions of Steem to allow account history to be stored on disk, thus cutting memory requirements in half.

HiveMind uses a Postgres database.
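In practice that means the indexed social data can be queried with plain SQL. A minimal sketch of what that looks like - the table and column names here are assumptions based on Hivemind's schema and may differ between versions, and the connection string is just a placeholder:

```python
# Minimal sketch of querying a Hivemind Postgres database directly.
# Table/column names are assumptions; the DSN is a placeholder.
import psycopg2

conn = psycopg2.connect("dbname=hive user=hive host=localhost")
cur = conn.cursor()

# Ten most recently indexed posts (assumed columns: author, permlink, created_at).
cur.execute("""
    SELECT author, permlink, created_at
    FROM hive_posts
    ORDER BY created_at DESC
    LIMIT 10
""")
for author, permlink, created_at in cur.fetchall():
    print(f"@{author}/{permlink}  ({created_at})")

cur.close()
conn.close()
```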

5 comments in the first 10 minutes... This place is lively... 🕸️🍃

I ran into the gateway error when trying to load steemit for some minutes after the post went up. Could be why.

We've missed you. Things have been happening as you can see. Lots of drama, but some of us are still having fun.

Nice to see you @beanz! :D

Howdy Stranger.

Don't worry, it's been very lively around here in the past few weeks.

Posted using Partiko Android

It only looks lively for people who haven't been around lately :-)

Shhh everyone! Beanz is back.

I hope you are doing well and all things are well with you. To quote @steevc:

We've missed you...

That is very positive news, to say the least!
We left a note earlier today on a prior update by @steemitdev, where we believe one related piece of functionality on the api.steemit.com nodes is broken, as follows:

hey there,
We noticed at the time of the rollout of changes (as of Jan 10) that the post's rshares data is now returning undefined.
Is this an intended behaviour?
We are, for example, pulling post data using steem.api.getDiscussionsByCreated and then accessing the post's post.vote_rshares.
This is broken only on api.steemit.com and api.steemitdev.com.
Would be great if you can provide some insights or look into this.
Thanks!
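For reference, here is a minimal way to check what the API is actually returning for those fields, sketched as a raw JSON-RPC call instead of steem-js (same method, just under its condenser_api name):

```python
# Quick check of the rshares fields returned by the public API.
# Raw JSON-RPC equivalent of steem.api.getDiscussionsByCreated.
import json
import requests

payload = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "condenser_api.get_discussions_by_created",
    "params": [{"tag": "hivemind", "limit": 1}],
}
post = requests.post("https://api.steemit.com",
                     data=json.dumps(payload)).json()["result"][0]

print("net_rshares :", post.get("net_rshares"))
print("vote_rshares:", post.get("vote_rshares"))
```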

Yes, I am also very pleased and want to see communities, man. I just want subreddits, just something simple like chainbb. When that finally happens and we have an instant, reliable faucet for creating new accounts, then Steem can finally start growing organically on its own JUST from all the Reddit users who will migrate over just for the censorship resistance AND the payouts! In fact I think the censorship question alone will be enough to bring millions of users over, since that is what happened when Digg users first migrated to Reddit, and then Reddit users migrated to Voat and gab.ai etc.

We are some of the first wave of Reddit-to-Steem immigrants, and we are like the paratroopers who were dropped behind enemy lines into Normandy to capture strategic bridges etc. Soon we will have a Normandy-type invasion from the Bitcoin and crypto subreddits first into Steem, and then all sorts of other subreddits like r/trees, r/TheDonald and all sorts of political subreddits, plus photography subreddits looking to get some money. Meme crafters who normally post on Reddit and 4chan for nothing will find DMania and take it over, and also build their own versions to monetize memetics.

It's going to be a bright future for Steem; we have ridden out the bear market and it's actually made us much stronger.

My god, I hope we don't get attacked by the Reddit crowd....
They are nasty at times....
LOL

My guess is that this is included inside the minor update, but upvoted for visibility.

Based on @roadscape's comment on the issue, I don't think this will be addressed.

Edit: Looks like I misread the problem.

Are you referring to rshares per vote, or per post? If the latter, the net_rshares field is the canonical source.

Yes, the per-post one.
So are you saying the value of vote_rshares has been dropped, while net_rshares still has a value? I am unable to test this on api.steemit.com or api.steemitdev.com as I keep getting an RPC error atm.

Try again, there was a small hiccup ;)

Works now, and yes, net_rshares is returning a value, thank you.

unreblogging

This doesn't sound right. :D

This is a great update, though. Thanks for all the work on Hivemind; that sounds like a major size decrease!

dereblogging?

de/unsteeming :P

I think that upvote ===> un-vote.
Resteem ==> un-resteem.
So it is un-resteeming.

Lol

I like unsteeming

Dis-ReSteeen

Discombobulate?
Antidisestablishmentarianism?

I could wrap my head around deresteeming. I think.

Is that what Secretary Clinton did? I gotta learn how to do that kind of high tech stuff.

That's probably why Steemit.com was occasionally a bit funky over the last few days, but anyway: great job!

488GB RAM to 61GB on disk is a huge improvement!

You still need multiple servers to make up a full node, but with this change, you are able to use 64GB instances. (Minimum of 3 servers like this to make up a full node).

To mimic our setup on a smaller scale, it would be 2 x 61GB servers and 1 x 32GB for Hivemind's postgres. Not 3 x 64GB.

So does this mean I could run the 2 x 61GB on a 128GB HEDT machine and the 1 x 32GB on another machine on the same local gigabit Ethernet? Because I have this physical hardware already and would like to run a full node. I also have very fast internet.

Posted using Partiko iOS

With a configuration like this, I would run a single node that includes all of the plugins except for tags/follows on the machine with 128GB. The downside would be a longer replay/sync time, but it would likely be more performant than running two separate nodes on one machine (due to disk paging). Fast disk would be required, such as an NVMe SSD. The two machines talking over local gigabit Ethernet would work just fine, yes.

Thanks. Would it be better to move 64GB to another computer and run 3 separate machines? How fast does the processor need to be on the 32GB machine? Is a recent Pentium 4560 (dual core, 4 threads) enough for the third machine?

Posted using Partiko iOS

I hope you did not start to sweat because of the funky responses :D

Posted using Partiko Android

After completing this task, we have decreased our steemd instance sizes from 488GB RAM to 61GB RAM instances.

I'm hoping the 61GB RAM is just a typo, and should be "on disk" since they mention above that the full-nodes were on-disk.

Previously it was necessary to keep all state in RAM for acceptable performance. With many of the non-consensus social features now supported in hivemind instead of needing to be directly in Steem, it is now possible to keep all state for Steem on disk. That is why the RAM requirement has been dramatically reduced.

Ah I gotcha. So -- You'd still need about 64GB RAM to run a full-node? Huge improvement, no doubt -- but it's still a bit big....

Or am I getting mixed up with the requirements of Full Nodes and Witness Nodes? Witness nodes are smaller, yeah?

No, you can use multiple 64GB servers, not a single 64GB server.

Yes, the requirements for a witness node are much lower.

If you haven't seen our post about MIRA, you should check that out - that is a feature in development aimed at reducing requirements for all nodes, even the smaller consensus nodes (such as witnesses and seeds).

Great work! Hope this will make it feasible for dApps, or collaborations of some projects, to run their own full nodes in the near future :). Certainly a top priority from my point of view. Well delivered!

If you are experiencing any issues using Steemit.com please leave a note in the comments.

Encountered the typical 504 Gateway Error for maybe 5-10 min prior to reading this post (so roughly 40 minutes after it was published) if that is of any help. No other issues experienced yet.

Good, that means it is working. Yeah, how can I sleep now 🙈 I'm all fired up about this. 🤣😂

Posted using Partiko Android

How’s that “secure wallet” powerdown going?

Posted using Partiko iOS

Actual footage of Steemit HQ and Hivemind coming online

@mstafford, you and your Python skills will be taking over the world..

Python for the win! Gots ta keep it simple!

Sure. Even we economists know some python. Not that weird cpp stuff.

Python is huge for financial scripting!

As long as there is no drop in performance, I am happy...

This is a nice development. Good job steem development team.

For the past 10 minutes I have been getting a lot of 501 errors for Steemit pages; no problems surfing elsewhere on the net, just Steemit. 12:45 pm-12:55 pm Alaska Standard Time.

These issues should now all be resolved.

Hi... I'm not a programmer and I don't know about Hivemind. Can someone explain to me how it affects/benefits Steemit, in understandable language?

I'd noticed some issues with timeouts at the weekend. I hope you are over that now. Good work.

Steem (Busy and Partiko) was sometimes quite uncommunicative over the last few days.
Sometimes Busy & Partiko would not load at all.
Sometimes votes or comments were not accepted (request too long / timeout).

The wallet transfer list is also pretty short, but that may be intended.

Posted using Partiko Android

I believe Busy does not rely on our infrastructure. I am not sure about @partiko.

Yeah, that explains a lot. Thanks for the update!

Also, I know a lot of people are concerned about the silence around the power down from Steemit Inc, yourself, and Dan. Would you be willing to comment on it? Because honestly, any communication is better than none. Hell, it'd help with all the FUD getting thrown around last night...

My hope is that @ned is going to clear everything up with his Mission Vision Values strategy-post. (sometime in the very near future)

Here's hoping! There's a lot of questions and not a lot of answers. Even a simple "I got a post on the back burner to talk about it, sit tight" would go a long way.

Even a simple "I got a post on the back burner to talk about it, sit tight" would go a long way.

Exactly!

Since the latest news/experiences around Steemit Inc & Ned were bad (layoffs, illness, missed live-stream, ...) and since humans are very good at assuming & imagining things, the things people imagined weren't really positive; rather the opposite.

I really don't get it. I do understand the pressure and everything, and that you don't want to talk to people, but after postponing and then missing a live-stream and not writing a single word for a month, in these times where you can write a post from your phone while shi**ng, it is legitimate to assume that he died.

Precisely. Dispel the FUD, calm the masses.

spoiler alert :(

He talked about it in the recent PAL radio show.

Ned is a very strange guy, so we can never know what to expect.
I've actually commented that he's autistic after the Aggroed discussion the other day. Whoops. 😂
We are currently in very weird territory, so I hope things will get cleared up fast.

That’s what I’m talkin’ about! Progressing! Now let’s 10x baby!!! 💪🏼🙌🏼💯

Posted using Partiko iOS

10X, you must have just read a Grant Cardone Book XD

Grant gave me one personally 😂

Posted using Partiko iOS

Great to hear this. Does this mean that Hivemind functionality will be available through Steem public nodes and will soon be covered in the documentation?

OK... does this do anything to increase the value of $STEEM? Or bring larger adoption by the un-tech among the rest of us?

Not so much!
The people that can comprehend what is being said here are largely already invested in Steem ;-)

Woohoo! Congrats. Sounds like we just got upgraded

I love it! Even though I could not understand what it all means, I can still feel the bright spirit emanating from this news, @steemitblog.
God bless your work.

Whoooooowwww @exyle, check this out... I'm gonna rent me another VPS to run a full node 🙈 I have to work tomorrow, oh no.. great... I'll order it tomorrow. I just reacted to your post 🤣😂🤣😂 Yeah! Good job!

Posted using Partiko Android

Even though I don't understand a thing, I'm happy for the updates. Keep them coming!

Hurrah! That's great news. I thought there was the odd error here and there and figured something was going on under the hood.

It's actually good knowing that something is going on under the hood ;O)

Hivemind is ready - that is huge news!
Does that also mean that the long-awaited communities feature will be available soon?
Great update, guys!!!

Posted using Partiko Android

The communities feature is a priority in the new Mission, Vision, and Values that can be seen on the about page. Also see Ned's post from earlier today.

NO REAL INFO on the about page, bro ;-)
But thanks for all your comments, it's much clearer to me now...
I am missing the "How to become a witness" link, and the page that specs out the future job of witnesses. Now that it does not require a lot of money to run a node, I wonder if that might remap the witness roster, as far as WHO is really willing to be the most professional witnesses?
AND what does that really mean; should witnesses be responsible for promoting the network as well?

Pieces are falling together!
Yep, there is this bug when you check a wallet - you can't go back to your blog page (or any other page) by clicking buttons (reload is the only way).

Cheers!

I'm a victim of this too.

Thanks for the report, fix incoming!

Great Scott! Been waiting to use that one for a while! Lol 😂 Really hope this does bring some improvements, coz hot damn, the morale in this place is lower than the current price.

Posted using Partiko iOS

This is actually wonderful news. Great work guys.

Probably the best news we've had from you guys... for a while.

Holy cow, that is quite the reduction! Thanks for sharing this with us.

I have been encountering annoying problems while using Steemit.

First of all, I do not see all the posts when browsing a given tag; even in the feed tab I do not see them! I checked on Busy and SteemPeak - there were no problems there.

Secondly, when going, for example, to the wallet tab and scrolling down to see the transfers, instead of them I see the first part of the wallet (balance, savings, etc.) shown to me 100 times, and only then the history of transfers. The same occurs when I click on the curation or author rewards: I see my wallet 100 times, then the rewards. And this is not a fault of the device, because I checked Steemit on my main computer, laptop, tablet and two phones - the page is bugging everywhere. For now, these are the only bugs I have noticed, but they are very annoying. I hope that you will be able to fix them in the near future.

Thank you @steemitblog :)

Posted using Partiko Android

I haven't been able to reproduce this on your profile. Does it persist?

Unfortunately... yes.
I even tried to delete the cache and change browsers. Same :(

Posted using Partiko Android

Identified the bug, fix to be deployed shortly. Thanks!

No problem Sir! Have a great day :)

Posted using Partiko Android

Haha. Yeah. Exactly the same thing. Multiple copies of the wallet when scrolling down.

I am confused: Wasn't Hivemind supposed to be the long-awaited community-account feature?

Hivemind is the software that can make the communities feature possible. It is now running and serving content for steemit.com, which is a major accomplishment. This means that the communities feature could be built and implemented - along with other types of social features that were previously non-trivial to implement at the consensus blockchain level. This opens the door for much easier and faster development of features that everyday users can actually see.

As an added bonus, it is also more cost effective to run than relying solely on the blockchain for non-consensus data.

Does that mean we will be getting the communities feature sometime in the near future?

The communities feature is a priority in the new Mission, Vision, and Values that can be seen on the about page. Also see Ned's post from earlier today.

Ok, thanks for the explanation.

I'm starting to get somewhere with my GraphQL API that uses Hivemind as a backend. If anyone wants to check it out, I've kept the developer playground open at https://steem-graphql.jakerawsthorne.co.uk. Let me know if anything breaks. So far you can query posts and accounts.

While this is great news and essential, when is Steemit Inc going to communicate with the community about the massive powering down from Steemit-owned accounts as well as the personal account of @ned?

This blog post makes me feel a little better that STINC isn't making a bank run and abandoning the community, but still, without an answer, I can't help but worry about the movement of seriously large amounts of STEEM, including the powering down of $8.5M USD worth on the @steemit account.

What is going on?

Hey now, you know what they say... @beggars can't be choosers. You got hivemind, that's something for now. ;)

A hobo and a beggar walk into a bar. 😂😂😂

Nice to see progress. Resteemed :-)

Thanks for the amazing progress. I hope this will get us further in the long run and will help us compete with EOS in terms of the feasibility of running your own full node.

Long-awaited news of any kind. It seems very positive indeed!

Will go next and see what Ned said on the aggroed and llfarms show. Hope he addressed the power-down thing at more length.


Sometimes I wish Steemit accepted payouts; it would be a good way to have the community really vote with their stakes when changes are made, showing our approval of said changes.

Anyway, hopefully this is the start of many other cost reducing improvements.

Such a RAM optimization is very impressive!

Impressive move by the team... Congrats everyone on the platform.

Posted Using - Supersteemian Android Application

Great progress.... Now I need to start learning stuff to understand what this means... lol

regards @paulnulty
When you read the first 20 or more comments, you will learn a lot - a master class of valuable information. That makes two of us hahaha.
(I use a translator)

Congratulations @steemitblog!
Your post was mentioned in the Steem Hit Parade in the following category:

  • Comments - Ranked 2 with 95 comments