
# 2nd update of 2025: Hive API stack v1.27.11 officially released

@blocktrades · $168.91

![blocktrades update.png](https://images.hive.blog/DQmSihw8Kz4U7TuCQa98DDdCzqbqPFRumuVWAbareiYZW1Z/blocktrades%20update.png)

Today we officially tagged the latest version of the Hive API node stack (v1.27.11). 

We’ve been running various beta versions of the new stack for quite a while on api.hive.blog, and it’s been easy to see that it performs much better than the old stack running on most of the other Hive API nodes. The release version of the API is now accessible at https://api.syncad.com, and we’ll switch api.hive.blog to the release version of the stack tomorrow.

With the official release of the new stack, I expect most of the other nodes will be updating to it within the next week or so, and we should see higher performance across the ecosystem.

Here’s a quick summary of some of what the BlockTrades team has been working on since my last report. As usual, it’s not a complete list. Red items are GitLab links to the actual work done.

## Upgrading everything to build/run on Ubuntu 24

One of the main changes we made across the entire set of apps was updating our build/deployment environment from Ubuntu 22 to Ubuntu 24. This required more work than expected, as it also involved an upgrade to a new version of Python, which is heavily used in our testing system and by several of our development tools, requiring changes to that Python code.


## [Hived: blockchain node software](https://gitlab.syncad.com/hive/hive)

We improved the [snapshot/replay processing](https://gitlab.syncad.com/hive/hive/-/merge_requests/1517) so that you can first resume from a snapshot, then replay any additional blocks you have in your current block log, instead of requiring your node to re-sync those blocks.

### Optimizations in progress

We’re currently finishing up a few long-planned performance improvements: 1) a fixed-size block memory allocator (https://gitlab.syncad.com/hive/hive/-/merge_requests/1525) and 2) moving comment objects from memory into a RocksDB database.

While benchmarking the fixed-block allocator, we saw an 18% speedup in in-memory replay time and a bigger speedup (23%) for disk-based replays. The new allocator also reduces memory usage by 1665 MB.
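
To illustrate the general idea behind a fixed-size pool allocator (a simplified Python sketch only; hived’s actual allocator is C++ and operates on its shared-memory object indexes), objects of a given type are handed out from pre-allocated, identically sized slots, so allocation is just a free-list pop and fragmentation stays low:

```python
# Simplified, illustrative sketch of a fixed-size pool allocator (not hived's C++ code).
class FixedPool:
    def __init__(self, slot_count: int):
        # All slots are reserved up front; a free list tracks the unused slot indexes.
        self.slots = [None] * slot_count
        self.free = list(range(slot_count))

    def allocate(self, obj):
        # Allocation is an O(1) free-list pop; every slot has the same size,
        # so repeated allocate/release cycles cannot fragment the pool.
        idx = self.free.pop()
        self.slots[idx] = obj
        return idx

    def release(self, idx):
        # Released slots go back on the free list and are reused for the same object type.
        self.slots[idx] = None
        self.free.append(idx)

pool = FixedPool(slot_count=4)
a = pool.allocate({"id": 1})
pool.release(a)              # the slot is recycled rather than returned to the heap
b = pool.allocate({"id": 2})
assert a == b                # same slot reused, keeping objects densely packed
```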

Moving comment objects from memory also drastically reduced the size of the statefile, which will make it easy for systems with relatively low amounts of memory to do “in-memory” replays. I’ll provide more details on this later after I’ve personally benchmarked the new code, but everything I’ve heard so far sounds quite impressive.

### Upcoming work: enhancing transaction signing 

We still plan an overhaul of the transaction signing system in hived. These changes will be included as part of the next hardfork and they are also tightly related to the support for “Lite Accounts” using a HAF API (my plan is to offer a similar signing feature set across the two systems to keep things simple). So the Lite Account API will probably be rolled out on a similar time frame.

## [HAF: framework for creating new Hive APIs and apps](https://gitlab.syncad.com/hive/haf)

* We now store the state history of a HAF node inside its database (e.g. when it switches between massive sync, live sync, etc): https://gitlab.syncad.com/hive/haf/-/merge_requests/595 
* Documentation updates: https://gitlab.syncad.com/hive/haf/-/merge_requests/629 https://gitlab.syncad.com/hive/haf/-/merge_requests/630

### Upcoming work: lightweight HAF servers

Plans to support an alternate “lightweight” version of HAF with pruned block data: https://gitlab.syncad.com/hive/haf/-/issues/277

## [HAfAH: account history API](https://gitlab.syncad.com/hive/HAfAH)

We switched to using structured parameter/return value types for the new REST API. Only a few apps currently use the new API, and these apps (e.g. Denser and the HAF block explorer UI) have been upgraded to use the newer version of the API.

## [Hivemind: social media API](https://gitlab.syncad.com/hive/hivemind)

There was a fix to hivemind’s healthcheck due to an update to HAF’s schema: https://gitlab.syncad.com/hive/hivemind/-/merge_requests/867

### Optimized notification cache processing to reduce block processing time
We optimized notification cache processing. Previously this code had to process the last 90 days’ worth of blocks (around 3 million blocks) to generate notifications for users, so it consumed the vast majority of the time that hivemind needed to update on each block during live sync (around 500ms on a fast system). Now we incrementally update this cache on a block-by-block basis, and it is 100x faster (around 5ms). Total time for the hivemind indexer to process a block is now down to a comfortable 50ms. There are a few minor issues with the new code, which we’ll resolve in a later release.
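
A rough sketch of the change (hypothetical names, not hivemind’s actual code): instead of rescanning the whole 90-day window on every block, the cache only absorbs the new block’s notifications and drops entries that age out:

```python
from collections import deque

# Hive produces a block every 3 seconds, so 90 days is roughly 2.6 million blocks
# (the "around 3 million blocks" mentioned above).
WINDOW_IN_BLOCKS = 90 * 24 * 60 * 20

# Cache entries are (block_num, notification), kept in block order.
cache = deque()

def on_new_block(block_num, notifications_in_block):
    """Incremental update: append the new block's notifications, trim expired ones."""
    for n in notifications_in_block:
        cache.append((block_num, n))
    cutoff = block_num - WINDOW_IN_BLOCKS
    while cache and cache[0][0] < cutoff:
        cache.popleft()
```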


### Redesigned follow-style tables to be more efficient

We also redesigned the tables and queries for managing follows, mutes, blacklists, etc. This not only reduced storage requirements but, more importantly, allows queries against these tables to scale well over time. In particular, functions that need to skip "muted" information should be much faster.

These changes were done in https://gitlab.syncad.com/hive/hivemind/-/merge_requests/863  https://gitlab.syncad.com/hive/hivemind/-/merge_requests/869 https://gitlab.syncad.com/hive/hivemind/-/merge_requests/872

### Configurable timeout for long API calls

Hivemind also now has a configurable “timeout” on API calls that API node operators can set to auto-kill pathological queries that might unnecessarily load their server. By default it is set to 5s, which I think should be appropriate for most current servers. Operators of very fast servers may consider lowering this value, and very slow servers may want to increase it.
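
One common way to enforce such a limit (a hedged sketch; hivemind’s exact mechanism may differ in detail) is a Postgres statement_timeout on the session or role that serves API reads, so the database itself cancels queries that run too long:

```python
import psycopg2

# Illustrative only: the DSN and role name below are assumptions based on the
# "hivemind_user" role mentioned in the HAF API node section.
conn = psycopg2.connect("dbname=haf_block_log user=hivemind_user")
with conn.cursor() as cur:
    cur.execute("SET statement_timeout = '5s'")   # Postgres aborts longer statements
    try:
        cur.execute("SELECT pg_sleep(10)")        # stand-in for a pathological query
    except psycopg2.errors.QueryCanceled:
        conn.rollback()                           # surfaced to the caller as a failed API call
        print("query exceeded the configured timeout and was cancelled")
```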

## [Balance tracker API: tracks token balance histories for accounts](https://gitlab.syncad.com/hive/balance_tracker)

* Similar to HAfAH, the REST API was modified to support structured parameter and return types.
* Added daily, monthly, and yearly aggregation data for coin balances.
* Further sped up sync time.
* Added limits for APIs that take a page size.
* Added support for delegation processing.
* Added tracking of savings balances.
* Added a new API for recurrent transfers.

## [Reputation tracker: API for fetching account reputation](https://gitlab.syncad.com/hive/reputation_tracker)

* Fixed a problem with upgrading reputation_tracker on a haf server: https://gitlab.syncad.com/hive/reputation_tracker/-/merge_requests/96
* Rewrote reputation calculation algorithm to get a big speedup: https://gitlab.syncad.com/hive/reputation_tracker/-/merge_requests/89

## [HAF Block Explorer](https://gitlab.syncad.com/hive/haf_block_explorer)

* Rewrote the algorithm to speed it up, which also eliminated the need for periodic full vacuums of the tables.
* Rewrote permlink search: https://gitlab.syncad.com/hive/haf_block_explorer/-/merge_requests/301
* Added an index for filtering by post in permlink search: https://gitlab.syncad.com/hive/haf_block_explorer/-/merge_requests/303
* New block API: https://gitlab.syncad.com/hive/haf_block_explorer/-/merge_requests/294
* Unify API return types to use structured types: https://gitlab.syncad.com/hive/haf_block_explorer/-/merge_requests/298
* Unify types in hafbe endpoints: https://gitlab.syncad.com/hive/haf_block_explorer/-/merge_requests/300
* Add max limits to various endpoints: https://gitlab.syncad.com/hive/haf_block_explorer/-/merge_requests/287
* Add blocks index on hash column: https://gitlab.syncad.com/hive/haf_block_explorer/-/merge_requests/282

## [HAF API Node (Scripts for deploying and managing an API node)](https://gitlab.syncad.com/hive/haf_api_node)
For anyone who wants to run a Hive API node server, this is the place to start. This repo contains scripts for managing the required services using docker compose.

* Fixes to assisted startup script: https://gitlab.syncad.com/hive/haf_api_node/-/merge_requests/89
* Various fixes to healthchecks
* There is a separate “hivemind_user” role that is used to allow for timeout of API-based queries (as opposed to indexing queries). As mentioned in the hivemind section, this timeout defaults to 5s.
* We now use the “haf” prefix by default instead of “haf-world” to shorten container names.
* Temp files under 200 bytes aren’t logged, to reduce log spam.
* More database tuning settings were applied based on analysis with pg_gather. In particular, work_mem was reduced from 1024MB to 64MB, which should reduce the chance of an OOM condition on a heavily loaded server.

## [WAX API library for Hive apps](https://gitlab.syncad.com/hive/wax)

* Added support for external signature providers (like Keychain) in the Wax transaction creation process.
* Eliminated issues reported by the Dependabot service related to dependency- and code-based vulnerabilities.
* Implemented support for MetaMask as a signature provider extension. We are waiting for security audit verification (of the dedicated MetaMask snap implementation supporting Hive integration) to make Hive officially supported by MetaMask. Hive has also been included in https://github.com/satoshilabs/slips/blob/master/slip-0044.md
* Improving the error information available to applications when API calls fail. The first step is mostly done in the hived repo: generation of constants representing specific FC_ASSERT instances. After that, exception classes in WAX will wrap the most common error cases and expose them to Python/TS to simplify error processing (currently, complex regexp parsing is required on the client side to detect some types of errors; see the sketch after this list).
* Improvements to better support WorkerBee (our bot library).
* First working version of the Python implementation. API support is still in progress, but we expect to have our first prototype for automatically generating the API call definitions for Python from the swagger.json file (the same way it currently works for TypeScript) by next week.
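
As a rough illustration of the problem being addressed (the error text, helper, and exception name below are hypothetical, not WAX’s actual API), clients today classify failures by pattern-matching the node’s error string, which the planned typed exceptions would replace:

```python
import re

def is_missing_authority_error(rpc_error_message: str) -> bool:
    # Hypothetical example of the regexp-style matching clients currently resort to,
    # because failed FC_ASSERT checks come back as free-form text.
    return re.search(r"missing required (posting|active|owner) authority",
                     rpc_error_message, re.IGNORECASE) is not None

# With the planned WAX exception classes, the same check would become a typed
# `except MissingAuthorityError:` (name hypothetical), with no regexps involved.
print(is_missing_authority_error("missing required posting authority: alice"))
```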


## [Hive Wallet MetaMask Snap](https://gitlab.syncad.com/hive/metamask-snap)

This is a Hive wallet extension that allows you to sign transactions using keys derived from your MetaMask wallet.

We are currently preparing the library for an official security audit by improving project documentation and fixing issues after an internal review.

## [Hive Bridge](https://auth.openhive.network/)

To make joining Hive smoother, we created a Hive Bridge service providing basic features such as signing and encrypting (which can be used in bot authentication flows where a given user needs to confirm their authority by encrypting a provided buffer). The service is available at: https://auth.openhive.network


## [WorkerBee](https://gitlab.syncad.com/hive/workerbee)

This is a TypeScript library for automated Hive-related tasks (e.g. writing bots that process incoming blocks). We recently made performance optimizations to workerbee to support large-scale block processing scenarios where lots of previous blocks need to be fetched. Recent work has sped this up by as much as 3x. We’re continuing to work on further optimizations to eliminate bottlenecks.

## [Generic Healthchecker UI component](https://gitlab.syncad.com/hive/healthchecker-component)

We officially released the healthchecker UI component. This component can be installed into a Hive web app to allow a user to monitor the performance of API nodes available to them and control which API node is used. The HAF block explorer UI was also updated to use this component.

## [Denser](https://gitlab.syncad.com/hive/denser)

* Support for communities
* Support for keychain signing
* Beta version available for testing at https://blog.openhive.network

## [HiveSense: Semantic-based post searching using vector embeddings](https://gitlab.syncad.com/hive/hivesense)

HiveSense is a brand new HAF app to optionally allow for semantic searching of Hivemind data (i.e. Hive posts). This should solve a long-standing problem where it has been difficult for users to find older content, so it should be a very nice improvement to the ecosystem.

It works by running deep learning algorithms to generate vector embeddings for Hive posts. These can then be searched to identify posts that are semantically related (related by meaning rather than by exactly matching words) to a user’s search term.
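
As a generic illustration of the approach (not HiveSense’s actual model, schema, or storage), posts and the search query are mapped to embedding vectors and ranked by cosine similarity:

```python
import numpy as np
from sentence_transformers import SentenceTransformer  # any embedding model works here

# Illustrative model choice; HiveSense may use a different model and store vectors differently.
model = SentenceTransformer("all-MiniLM-L6-v2")

posts = [
    "How to set up a Hive witness node on Ubuntu",
    "My garden tomatoes finally ripened this week",
    "Benchmarking HAF replay times on NVMe drives",
]

post_vecs = model.encode(posts, normalize_embeddings=True)   # one vector per post
query_vec = model.encode(["speeding up blockchain node sync"],
                         normalize_embeddings=True)[0]

# With normalized vectors, cosine similarity is just a dot product.
scores = post_vecs @ query_vec
for idx in np.argsort(-scores):
    print(f"{scores[idx]:.3f}  {posts[idx]}")
```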

This project is still undergoing development and testing right now, but the code is available for public experimentation. The new repo is here: https://gitlab.syncad.com/hive/hivesense
@acidyo ·
Awesome to see these come to light! 

That last one I know is going to make life easier for many users and myself who always think of a post but give up even trying to search for it.
@aftabirshad ·
You have really done a great job; you should be given a separate crown for that. Believe me, this platform really needs people like you.
@andablackwidow ·
$0.19
Adding my 2c to hived's `Optimizations In progress`.

First one - pool allocators - in principle does not reduce memory usage (at least not at current stage). In fact it increases it somewhat, however due to less fragmentation actual usable memory increases, hence less allocation is reported. It is the same effect we could already get by loading snapshot, except now the benefit persists instead of degrading over time. I'm pretty curious about the performance increase, especially in combination with just finished [undo session optimizations](https://gitlab.syncad.com/hive/hive/-/merge_requests/1526). As far as I know pool allocators are not yet applied to objects created for undo session, so there is still room for improvement (might even show up in performance of regular sync, not just in extreme traffic in live sync and in colony+queen cycle). Also we have detailed data on average amount of objects needed at any given time which might allow some tweaking of pool sizes for each individual index, which might also slightly improve overall performance still. Once the MR is included in `develop` [further task opens up](https://gitlab.syncad.com/hive/hive/-/issues/675#note_159292), but Marek wanted to first get the estimation of how much actual memory consumption reduction we can gain with it, especially considering that the biggest item is getting much smaller with second mentioned optimization - comment archive.

As for "comment archive" optimization, drastic reduction of memory usage is obviously expected, but the first benchmarks look too good to be true. Even the extreme pessimistic scenarios are faster than current version, with most of comment accesses to archive in RocksDB instead of fresh comments inside memory multiindex  (measured on big blocks with over 14k votes in each block). It is not entirely impossible, because RocksDB might be algorithmically faster even though each access to data is potentially much slower, but that would mean it makes sense to take a deep look into type of multiindex we are using, and not just for comments - more opportunities for optimizations :o)
@blocktrades ·
$0.21
I don't really agree that there's not a real memory usage reduction. The memory is in fact allocated by the software, and hence isn't available to other programs, regardless of whether it is actively storing data or not. Further, I don't think it makes sense to compare it to a snapshot load, because that is at best a temporary thing, not reflective of how the software will behave over time.
@andablackwidow ·
![](https://images.ecency.com/DQmecbFfWPsENKBWVNHdDvQqaJCEUyiJAyZNNE14nTzAvZG/technically_correct.jpg)
What I was trying to say is that in optimized version hived **requests** to allocate more memory (because it rounds up to the size of whole pool, even if space might happen to never actually be used later), as opposed to next potential task that focuses on storing the same data in more compact form, thus leading to smaller index nodes and requests to allocate less memory.

> I don't think it makes sense to compare it a snapshot load, because that is at best a temporary thing, not reflective of how the software will behave over time.

Well, the underlying mechanism of why we get lower effective memory allocation is the same - objects are placed better in memory. In case of snapshot load it just happens because objects of the same class are allocated one after another, while pool allocator guarantees such behavior. That's why I said `except now the benefit persists instead of degrading over time`.

---
It is funny how the mind works. I wasn't consciously thinking about it, and yet I think I figured out why voting for old comments, when the data might need to be pulled from disk, might be faster than voting for fresh comments. Now it is kind of obvious, but since I expected reaching for archived comments to be two or three orders of magnitude slower than using multiindex, the results showing it was faster clouded the most likely reason.

When voting for archived comments all hived  needs to do is to determine the target comment exists and then the work ends. On the other hand voting for fresh comments is actual voting, calculating mana to consume, strength of the vote, creation of related state object, updating data in multiple other objects, and then all that work is undone, redone once again during block production, undone again and performed yet again during block application. In case of archived comments it also needs to reach for it three times, but second and third time it reaches into cache. For a fair comparison we'd need to measure just access to comment in isolation from everything else.

If above turns out to be correct explanation of observed behavior, it would put me at ease, because it means there is no danger in using that optimization even on witness nodes, and even for saturated traffic with big blocks. I might still try to implement the in-memory version of archive if only to test whether Mariusz designed the interfaces correctly :o)
@arshadkhan4421 ·
Great efforts and progressive development.
@blocktrades ·
$1.70
We benchmarked the new stack on an AMD 9950X3D with two 4TB T700 NVMe drives and 64GB of RAM. HAF itself, using a RAM disk temporarily for the state file, replays in 13.34h. Next, replaying the apps in parallel takes 49h39m (block explorer takes 7h38m, reputation tracker takes 11h20m, and hivemind takes 49h39m). So the total time to replay on this quite fast system is 13.34h + 49.65h ≈ 63h = 2 days 15 hours.
@brightest ·
Great development down here 👍😍
@christybliss ·
We are glad to learn about these developments. Thank you 
@cuteboi ·
This is a massive improvement and good updates. Thank you for the effort put in achieving this. You deserve an applaud 👏💪
@danzocal ·
!PIZZA
@driipam ·
$0.05
Such cool developments happening! Thanks :)!
@ecoinstant ·
$0.56
https://www.youtube.com/watch?v=0Friu3Thk5Y
@hivebuzz ·
Congratulations @blocktrades! Your post has been a top performer on the Hive blockchain and you have been rewarded with this rare badge

<table><tr><td><img src="https://images.hive.blog/60x60/https://hivebuzz.me/badges/toppayoutday.png"></td><td>Post with the highest payout of the day.</td></tr>
</table>

<sub>_You can view your badges on [your board](https://hivebuzz.me/@blocktrades) and compare yourself to others in the [Ranking](https://hivebuzz.me/ranking)_</sub>
<sub>_If you no longer want to receive notifications, reply to this comment with the word_ `STOP`</sub>



**Check out our last posts:**
<table><tr><td><a href="/hive-122221/@hivebuzz/pum-202504-result"><img src="https://images.hive.blog/64x128/https://i.imgur.com/mzwqdSL.png"></a></td><td><a href="/hive-122221/@hivebuzz/pum-202504-result">Hive Power Up Month Challenge - April 2025 Winners List</a></td></tr><tr><td><a href="/hive-122221/@hivebuzz/pum-202505"><img src="https://images.hive.blog/64x128/https://i.imgur.com/M9RD8KS.png"></a></td><td><a href="/hive-122221/@hivebuzz/pum-202505">Be ready for the May edition of the Hive Power Up Month!</a></td></tr><tr><td><a href="/hive-122221/@hivebuzz/pud-202505"><img src="https://images.hive.blog/64x128/https://i.imgur.com/805FIIt.jpg"></a></td><td><a href="/hive-122221/@hivebuzz/pud-202505">Hive Power Up Day - May 1st 2025</a></td></tr></table>
@itsmemic ·
Been following Hive for a while now and this release feels like a huge leap forward. Thanks for keeping us updated with such detailed reports, makes techies like me feel part of the process!
@jarvie ·
I wanted to understand this post in a way that was a bit easier for me and others so I asked Gemini for some help... figure this could help someone else so i'm pasting it here: (Obviously if there are any misunderstandings feel free to correct)

----

Okay, Hive users, let's break down this tech update from Blocktrades! Think of this as upgrading the engines and tools that make Hive work behind the scenes.

Here’s the lowdown on what **Hive API stack v1.27.11** means for you:

**The Big News: Hive Just Got Faster!**

* The main point is a **new, improved version of the software (v1.27.11)** that powers many of the Hive websites and apps you use (like PeakD, Ecency, Liketu, Splinterlands, etc.).
* Blocktrades has been testing it, and it's **noticeably faster** than the older versions most services currently run.
* You can already experience this speed boost on `api.hive.blog` (switching soon) and `api.syncad.com`.
* **Expect most Hive apps and websites to feel snappier** in the coming weeks as other node operators (the people running the servers for Hive apps) upgrade to this new version.

**What Else is Cooking? (Key Improvements)**

1.  **Smoother Performance & Less Hiccups:**
    * **Faster Notifications:** A major bottleneck in getting your notifications quickly has been fixed. It's now **100 times faster** on the backend! This means notifications should appear much more promptly in apps that use Hivemind (most social front-ends).
    * **Faster Feeds (Especially with Mutes/Blacklists):** They've improved how follows, mutes, and blacklists are handled. This should make loading feeds and filtering content (like hiding muted posts) much quicker and more efficient.
    * **More Stable Nodes:** Several technical tweaks (like better memory management, database tuning, and timeouts for slow queries) mean the servers running Hive apps should be more stable and less likely to crash or slow down under heavy load.

2.  **Better Ways to Find Content (Coming Soon!):**
    * **Smarter Search (HiveSense):** This is exciting! They're building a new tool called HiveSense that uses AI (vector embeddings) to understand the *meaning* of posts. Instead of just searching for exact keywords, you'll be able to search for topics or ideas, making it much easier to find relevant older content. This could be a game-changer for discovering posts!

3.  **Wallet and Account Improvements:**
    * **MetaMask Integration Coming:** Work is well underway to let you use your **MetaMask wallet** to sign Hive transactions via a "MetaMask Snap." It's waiting for a security audit, but this could make using Hive even easier for those familiar with MetaMask.
    * **Better Balance Tracking:** The system that tracks your token balances is getting upgrades, including daily/monthly/yearly summaries and support for tracking savings and recurring transfers. Apps using this might be able to show you more detailed info.
    * **Easier App Connections:** Improvements to how apps talk to wallets (like Keychain) and handle errors.

4.  **Behind-the-Scenes Foundation Work:**
    * **Modernized Systems:** Everything's been updated to run on the latest operating systems (Ubuntu 24) and software versions (Python), keeping Hive up-to-date.
    * **Faster Node Syncing:** Improvements make it quicker for node operators to start or restart their servers, meaning less downtime and faster recovery if issues occur.
    * **Lite Accounts & Better Signing:** Future work is planned to improve how transactions are signed and introduce "Lite Accounts," making things potentially simpler or more flexible down the road.

**In a Nutshell:**

This update is all about **speed, efficiency, and building for the future**. You should start noticing faster apps and websites soon. Big improvements like smarter search and MetaMask integration are on the horizon. The Blocktrades team (and other Hive developers) are working hard under the hood to make your Hive experience better!
@blocktrades ·
$0.24
Generally correct, but the speedup notifications updates won't be visible to users: the speedup is in the time it takes to build the table data (it used to take around 1/2 second per block, loading the api node server more than it should). But this is just a speedup in updates to the table, not the rate at which the data is fetched from the table (that will be the same speed, so no difference in speed).
@jarvie ·
Thanks for the nuance information on that.
@blocktrades ·
$0.04
It's also worth pointing out it's not a complete summary either :-) For example, it doesn't even mention the new health checker component or workerbee at all as far as I can see. It seems like a good summary for a non-dev, but missing a lot of important info for devs.
@jarvie ·
Yes, I did aim it at an audience of non-devs in this case.
But good to point out there is more for devs; they'll want to read the full post.
@kgakakillerg · (edited)
Hello I just wanted to ask why are you delegating to @buildawhale which is using the power to do wrong 🤔
@solominer ·
@kgakakillerg ·
I didn't know I mentioned you 😂 ok are you the owner of @blocktrades @buildawhale or @themarkymark

All I'm going to say is look at the comments in this post https://hive.blog/hive-135178/@kgakakillerg/a-trip-to-the-tower-of-london-2024-or-a-walk-around-the-tower-of-london-2024-part-31

Also @bpcvoter3 has exposed everything so please stop 🛑 

Buildawhale is a big farm and downvoter 🤔 so what good are they doing please explain 

I've also heard alot about you 😂
@kingsleyy ·
Hello, good day @blocktrades,

Is there a way to get help from this account/project?

Okay, from the look of things, it seems almost impossible to do that because it’s obvious this account is meant for a project. However, there’s no harm in trying, as I’ve seen this account being supportive of the growth of the Blockchain.

We are humbly seeking support for [The Comedy Club Community](https://peakd.com/c/hive-115325/created). It’s a newly created community designed to fill in the missing piece of laughter on the Blockchain. The main aim is exactly that—with no external intentions.

To make this soliciting request brief,
Please help us in any capacity—perhaps through HP delegation or by extending your hand in curation support for entries to our contests, under your terms and conditions.

Sorry, I know this is supposed to be a private chat, but I thought it would be better (as an attempt) to make this request here. Maybe we could continue on Discord if you're okay with that. Also, kindly forgive my poor manner of making this request.

Many thanks!🙌


@luthis26 ·
Wow! This is an incredibly comprehensive update. The dedication of the Blocktrades team to improving Hive is really impressive. Looking forward to seeing these changes propagate across the network. Thank you for the detailed report.
@mfontom ·
Thank you for the update. It will help us greatly as we engage here 
@pizzabot · (edited)
RE: 2nd update of 2025: Hive API stack v1.27.11 officially released
<center>PIZZA!


$PIZZA slices delivered:
@danzocal<sub>(2/10)</sub> tipped @blocktrades 
sirsmokesalot96 tipped blocktrades 


<sub>Come get [MOON](https://moon.hive.pizza)ed!</sub></center>
@raymondelaparra ·
Congratulations @blocktrades
@sg-gardener ·
Congratulations for 1000 vote .. sir please  help me //Build a whales 🐋  why unvote me 
@sg-gardener ·
Sir your journey is so smooth for us
@sirsmokesalot96 ·
!PIZZA
@solominer ·
We have well over a hundred active Hive witnesses, but only around 10 active API nodes.

Should we have more API nodes?
@blocktrades ·
I think around 10 is enough for now. With the new stack, any single one of them can easily handle all of the current traffic load of Hive, so adding lots at the moment is just overkill in terms of resource usage. Right now, the best thing is to have them spread out across the globe so that Hiveans can get Hive data faster wherever they are located.
@spiritsurge ·
$0.07
Great set of updates!! Obvious big fan of Denser and Wax. 

Some questions:

a) For Hive Bridge, I see that it is not detecting Metamask extension. But rather it is calling for Metamask Flask in "Requesting Account Creation". Is there a documentation that we can look at for Hive Bridge or is it part of Metamask snap?

b) Is the HealthChecker UI used in any site somewhere? 

Checking the other projects. Well done to the core dev team.
@blocktrades ·
I'll ask about Hive Bridge, I'm not involved in that project directly, so don't know details.

The healthchecker UI is embedded into the latest haf block explorer. You can see it by going to https://testexplore.openhive.network/ and clicking on the API node links at the bottom of the page to bring up the node page.
@blocktrades ·
Regarding MetaMask Flask, I was told: "I think it's because we are still not official, and for that a third-party audit is required, which is scheduled soon."
@small.minion ·
Yes, our team is going through the verification process for the mentioned MetaMask snap. Until then, we can't use the official MetaMask distribution channel.
@ssjsasha ·
@ssjsasha "one of the only ones who is still here from way ba..."
one of the only ones who is still here from way back... you are a saint sir
@steevc ·
It's always good to hear of optimisations as that ought to help Hive scale up. Hive has been pretty stable for a long time now and that is vital. I know you do lots of testing, so I hope that continues.

Keep up the good work.