How to run a Spark Application as a daemon

Updated: 2024-10-06 20:29:31
Problem Description

I have a basic question about running a Spark application.

I have a Java client that sends me requests to query data residing in HDFS.

The requests arrive as a REST API over HTTP; I need to interpret each request, form Spark SQL queries, and return the response to the client.

I can't work out how to make my Spark application a daemon that waits for requests and executes the queries using a pre-instantiated SQL context.

Answer

You can have a thread that runs in an infinite loop to do the computation with Spark.

while (true) {
  val request = incomingQueue.poll()
  // Process the request with Spark
  val result = ...
  outgoingQueue.put(result)
}

Then, in the thread that handles the REST request, put the request in the incomingQueue and wait for the result from the outgoingQueue.

// Create the request from the REST call
val request = ...
incomingQueue.put(request)
val result = outgoingQueue.poll()
return result
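A runnable sketch of this two-queue hand-off, written in Java (the client's language) rather than the answer's Scala so it is self-contained: the class name `QueryDaemon`, the queue capacities, and the string concatenation standing in for the actual Spark SQL query are all illustrative, not part of the original answer. It uses the blocking `take()` instead of `poll()` to avoid busy-waiting on an empty queue.

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class QueryDaemon {
    // Queues connecting REST-handler threads to the single long-running worker thread.
    static final BlockingQueue<String> incomingQueue = new ArrayBlockingQueue<>(16);
    static final BlockingQueue<String> outgoingQueue = new ArrayBlockingQueue<>(16);

    public static void main(String[] args) throws Exception {
        // Worker thread: stands in for the Spark loop; in the real application it
        // would hold the pre-instantiated SQL context and run the query here.
        Thread worker = new Thread(() -> {
            while (true) {
                try {
                    String request = incomingQueue.take();   // blocks until a request arrives
                    String result = "result-for:" + request; // placeholder for the Spark SQL query
                    outgoingQueue.put(result);
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                    return;
                }
            }
        });
        worker.setDaemon(true);
        worker.start();

        // REST-handler side: submit a request and block until the answer comes back.
        incomingQueue.put("SELECT count(*) FROM logs");
        System.out.println(outgoingQueue.take());
    }
}
```

Note that with a single request/response queue pair like this, concurrent REST handlers could receive each other's results; a production version would pair each request with its own response channel (e.g. a `CompletableFuture` per request).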


Published: 2023-11-23 22:03:37
Link: https://www.elefans.com/category/jswz/34/1622977.html