Efficient MapReduce when dealing with a stream of queries to the same dataset

I have a large, fixed dataset and a stream of functions f that I need to apply to it.

Each query has the form reduce(map(f, dataset)), so MapReduce is a natural fit. However, I don't want to redistribute the data on every request (ideally, I'd like to exploit indexing to speed up f). Is there an existing implementation that handles this general case?
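As a concrete sketch of the query shape described above (the dataset and f below are stand-ins, not anything from the question):

```python
from functools import reduce

dataset = list(range(10))  # stands in for the large, fixed dataset

def f(x):
    # one query from the stream; only f changes between queries
    return x * x

# each query is reduce(map(f, dataset))
result = reduce(lambda a, b: a + b, map(f, dataset))
print(result)  # 285
```

The point of the question is that `dataset` stays constant while `f` varies, so reloading or reshuffling the data per query is pure overhead.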

I had a look at one project that might do the job, but it seems to handle a slightly different case, and its code is not available yet.

Best answer

Hadoop's MapReduce (and every other framework inspired by Google's MapReduce) redistributes the data across the cluster on every job.
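What the question is after instead is: partition once, keep the partitions resident, and stream queries against them. A minimal single-process sketch of that idea follows; every name in it (`make_partitions`, `run_query`, the sample queries) is illustrative, and in a real cluster each partition would live in a long-running worker's memory:

```python
from functools import reduce

def make_partitions(data, n):
    # Partition the dataset once, up front, instead of per query.
    return [data[i::n] for i in range(n)]

def run_query(partitions, f, combine, initial):
    # Each "worker" maps f over its resident partition and pre-reduces
    # locally; only the small partial results are combined at the end.
    partials = [reduce(combine, map(f, part), initial) for part in partitions]
    return reduce(combine, partials, initial)

partitions = make_partitions(list(range(100)), n=4)

# A stream of queries against the same resident partitions:
total  = run_query(partitions, lambda x: x,     lambda a, b: a + b, 0)
sum_sq = run_query(partitions, lambda x: x * x, lambda a, b: a + b, 0)
print(total, sum_sq)  # 4950 328350
```

This only works when the per-partition reduce and the final combine use an associative operation, which is the same constraint MapReduce's combiner imposes.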

Answers

No answers yet.
