Where are Riak Post-Commit Hooks run?

I'm evaluating Riak's post-commit hooks as a way to build a distributed, incremental map-based index, but I'd like to know where Riak actually runs post-commit hooks. Do they run on the node the client connected to for the commit, or on the primary node where the data is persisted? If it's the latter, I think I could efficiently emit map/reduce-style output from there and increment records.

Answers

From the docs

Post-commit hooks are run after the write has completed successfully. Specifically, the hook function is called by riak_kv_put_fsm immediately before the calling process is notified of the successful write.

riak_kv_put_fsm handles "coordination of Riak PUT requests", so the post-commit hook is run on the coordinating node, i.e. the node that the client sent the PUT to.
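
For reference, a Riak post-commit hook is an Erlang function that takes the committed riak_object as its single argument; its return value is ignored. Below is a minimal sketch of such a hook — the module and function names (index_hook, update_index) are hypothetical, and the log line is only there to show which node the hook executes on.

    %% index_hook.erl -- hypothetical post-commit hook module
    -module(index_hook).
    -export([update_index/1]).

    %% Riak calls the hook with the committed object; the return value
    %% is ignored, so failures here do not affect the write itself.
    update_index(Object) ->
        Bucket = riak_object:bucket(Object),
        Key    = riak_object:key(Object),
        %% node() reports where the hook runs -- per the answer above,
        %% this is the node coordinating the PUT.
        error_logger:info_msg("post-commit hook on ~p for ~p/~p~n",
                              [node(), Bucket, Key]),
        ok.

To try something like this, the compiled module would need to be on the code path of every node, and the hook is registered through the bucket's postcommit property, roughly {"props": {"postcommit": [{"mod": "index_hook", "fun": "update_index"}]}} via the HTTP or protocol buffers API.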
