Problem description
Using Netty, I receive multiple asynchronous messages from the framework on multiple threads. I need to send these messages to a network device (UDP) that uses a synchronous, stateful protocol. I therefore need a state variable and must allow only one message to be sent at a time, and only while the client is in the "idle" state.
In addition, the state machine needs to send its own internally generated messages (retries) ahead of whatever is waiting in the queue. For that use case I know how to inject messages into the pipeline, and that approach works as long as outbound messages can be held at the top of the pipeline.
Any ideas on how to gate output on the client state?
TIA
Workaround
I have come up with a proof-of-concept / proposed solution, but I hope someone knows a better way. Although it works as intended, it introduces a number of undesirable side effects that still need to be addressed.
- Run a blocking write handler on its own thread.
- If the state is not idle, put the thread to sleep and wake it when the state becomes idle.
- If the state is idle, or becomes idle, send the message on its way.
This is the bootstrap I use:
public class UDPConnector {

    private EventLoopGroup workerGroup;
    private EventExecutor blockingExecutor;
    private Bootstrap bootstrap;

    public void init() {
        this.workerGroup = EPOLL ? new EpollEventLoopGroup() : new NioEventLoopGroup();
        this.blockingExecutor = new DefaultEventExecutor();
        bootstrap = new Bootstrap()
            .channel(EPOLL ? EpollDatagramChannel.class : NioDatagramChannel.class)
            .group(workerGroup)
            .handler(new ChannelInitializer<DatagramChannel>() {
                @Override
                public void initChannel(DatagramChannel ch) throws Exception {
                    ch.pipeline().addLast("logging", new LoggingHandler());
                    ch.pipeline().addLast("encode", new RequestEncoderNetty());
                    ch.pipeline().addLast("decode", new ResponseDecoderNetty());
                    ch.pipeline().addLast("ack", new AckHandler());
                    // DefaultEventExecutor runs this handler off the I/O event
                    // loop, so its blocking write() does not stall the loop.
                    ch.pipeline().addLast(blockingExecutor, "blocking", new BlockingOutboundHandler());
                }
            });
    }
}
The blocking outbound handler looks like this:
public class BlockingOutboundHandler extends ChannelOutboundHandlerAdapter {

    private final Logger logger = LoggerFactory.getLogger(BlockingOutboundHandler.class);
    private final AtomicBoolean isIdle = new AtomicBoolean(true);

    public void setIdle() {
        synchronized (this.isIdle) {
            logger.debug("setIdle() called");
            this.isIdle.set(true);
            this.isIdle.notify();
        }
    }

    @Override
    public void write(ChannelHandlerContext ctx, Object msg, ChannelPromise promise) throws Exception {
        synchronized (isIdle) {
            // A while loop guards against spurious wakeups; a plain if + wait() is not enough.
            while (!isIdle.get()) {
                logger.debug("write(): I/O State not Idle, Waiting");
                isIdle.wait();
                logger.debug("write(): Finished waiting on I/O State");
            }
            isIdle.set(false);
        }
        logger.debug("write(): {}", msg);
        ctx.write(msg, promise);
    }
}
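Stripped of Netty, the handler reduces to a guarded wait on an idle flag. The following self-contained sketch (hypothetical class and method names, not part of the handler above) exercises that gate without a channel: a second "write" blocks until a simulated ACK calls setIdle().

```java
import java.util.List;
import java.util.concurrent.CopyOnWriteArrayList;
import java.util.concurrent.atomic.AtomicBoolean;

public class IdleGateDemo {
    private final AtomicBoolean isIdle = new AtomicBoolean(true);

    /** Called when the device ACKs: mark idle and wake a blocked writer. */
    public void setIdle() {
        synchronized (isIdle) {
            isIdle.set(true);
            isIdle.notify();
        }
    }

    /** Blocks until the gate is idle, then claims it (marks it busy). */
    public void acquire() throws InterruptedException {
        synchronized (isIdle) {
            while (!isIdle.get()) {
                isIdle.wait();
            }
            isIdle.set(false);
        }
    }

    public static void main(String[] args) throws Exception {
        IdleGateDemo gate = new IdleGateDemo();
        List<String> events = new CopyOnWriteArrayList<>();

        gate.acquire();                     // first "write" claims the gate
        events.add("write-1");

        Thread writer = new Thread(() -> {
            try {
                gate.acquire();             // second "write" blocks until the ACK
                events.add("write-2");
            } catch (InterruptedException ignored) { }
        });
        writer.start();

        Thread.sleep(200);                  // let the writer reach wait()
        events.add("ack-1");
        gate.setIdle();                     // simulated ACK releases the gate
        writer.join(2000);

        System.out.println(events);         // write-2 only appears after ack-1
    }
}
```

This mirrors the ordering visible in the debug log further down: the second write waits until the first ACK arrives.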
Finally, the block is released when the StateMachine transitions to the idle state:
Optional.ofNullable((BlockingOutboundHandler) ctx.pipeline().get("blocking")).ifPresent(h -> h.setIdle());
All of this serializes the outbound messages with the synchronous, stateful responses coming back from the device.
Of course, I would rather not have to deal with the extra thread and the synchronization it brings, and I don't know what kind of yet-to-be-discovered problems this approach will run into. One side effect is that the main handler is now accessed from multiple threads, which is just one more problem that needs to be solved.
Next up is implementing a timeout and retrying with backoff; the thread cannot stay blocked indefinitely.
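One way to bound that wait would be a timed condition instead of an open-ended wait(). The sketch below (hypothetical names, separate from the handler above) uses ReentrantLock/Condition with awaitNanos, doubling the wait on each retry; on failure the caller could fail the ChannelPromise instead of blocking forever.

```java
import java.util.concurrent.TimeUnit;
import java.util.concurrent.locks.Condition;
import java.util.concurrent.locks.ReentrantLock;

public class TimedIdleGate {
    private final ReentrantLock lock = new ReentrantLock();
    private final Condition becameIdle = lock.newCondition();
    private boolean idle = true;

    /** Called when the device ACKs: mark idle and wake one blocked writer. */
    public void setIdle() {
        lock.lock();
        try {
            idle = true;
            becameIdle.signal();
        } finally {
            lock.unlock();
        }
    }

    /**
     * Tries to claim the gate, waiting up to initialTimeoutMs and doubling
     * the timeout on each subsequent attempt (exponential backoff).
     * Returns true if the gate was claimed, false if every attempt timed out.
     */
    public boolean tryAcquire(long initialTimeoutMs, int maxAttempts)
            throws InterruptedException {
        long timeoutMs = initialTimeoutMs;
        for (int attempt = 0; attempt < maxAttempts; attempt++) {
            lock.lock();
            try {
                long nanosLeft = TimeUnit.MILLISECONDS.toNanos(timeoutMs);
                while (!idle && nanosLeft > 0) {
                    // awaitNanos returns the remaining time, so spurious
                    // wakeups do not extend the overall deadline.
                    nanosLeft = becameIdle.awaitNanos(nanosLeft);
                }
                if (idle) {
                    idle = false;   // claim the gate
                    return true;
                }
            } finally {
                lock.unlock();
            }
            timeoutMs *= 2;         // back off before the next attempt
        }
        return false;
    }
}
```

With this in place, write() would call tryAcquire(...) and, on false, complete the promise exceptionally (or requeue the message) rather than parking the executor thread indefinitely.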
15:07:16.539 [DEBUG] [.netty.handler.logging.LoggingHandler] - [id: 0xb19e73e2] REGISTERED
15:07:16.540 [DEBUG] [.netty.handler.logging.LoggingHandler] - [id: 0xb19e73e2] CONNECT: portserver1.tedworld.net/192.168.2.173:2102
15:07:16.541 [DEBUG] [internal.connection.UDPConnectorNetty] - connect(): connect() complete
15:07:16.541 [DEBUG] [.netty.handler.logging.LoggingHandler] - [id: 0xb19e73e2,L:/192.168.2.186:47010 - R:portserver1.tedworld.net/192.168.2.173:2102] ACTIVE
15:07:16.542 [DEBUG] [c.projector.internal.ProjectorHandler] - scheduler.execute: creating test message
15:07:16.543 [DEBUG] [c.projector.internal.ProjectorHandler] - scheduler.execute: sending test message
15:07:16.544 [DEBUG] [internal.connection.UDPConnectorNetty] - sendRequest: Adding msg to queue { super={ messageType=21,channelId=test,data= } }
15:07:16.546 [DEBUG] [c.projector.internal.ProjectorHandler] - scheduler.execute: Finished
15:07:16.545 [DEBUG] [rnal.protocol.BlockingOutboundHandler] - write { super={ messageType=21,data= } }
15:07:16.547 [DEBUG] [internal.connection.UDPConnectorNetty] - sendRequest: Adding msg to queue { super={ messageType=3F,channelId=lamp,data= } }
15:07:16.548 [DEBUG] [rnal.protocol.BlockingOutboundHandler] - write(): I/O State not Idle,Waiting
15:07:16.548 [DEBUG] [.netty.handler.logging.LoggingHandler] - [id: 0xb19e73e2,L:/192.168.2.186:47010 - R:portserver1.tedworld.net/192.168.2.173:2102] WRITE: 6B
+-------------------------------------------------+
| 0 1 2 3 4 5 6 7 8 9 a b c d e f |
+--------+-------------------------------------------------+----------------+
|00000000| 21 89 01 00 00 0a |!..... |
+--------+-------------------------------------------------+----------------+
15:07:16.550 [DEBUG] [.netty.handler.logging.LoggingHandler] - [id: 0xb19e73e2,L:/192.168.2.186:47010 - R:portserver1.tedworld.net/192.168.2.173:2102] FLUSH
15:07:16.567 [DEBUG] [.netty.handler.logging.LoggingHandler] - [id: 0xb19e73e2,L:/192.168.2.186:47010 - R:portserver1.tedworld.net/192.168.2.173:2102] READ: DatagramPacket(/192.168.2.173:2102 => /192.168.2.186:47010,PooledUnsafeDirectByteBuf(ridx: 0,widx: 6,cap: 2048)),6B
+-------------------------------------------------+
| 0 1 2 3 4 5 6 7 8 9 a b c d e f |
+--------+-------------------------------------------------+----------------+
|00000000| 06 89 01 00 00 0a |...... |
+--------+-------------------------------------------------+----------------+
15:07:16.568 [DEBUG] [nternal.protocol.ResponseDecoderNetty] - decode DatagramPacket(/192.168.2.173:2102 => /192.168.2.186:47010,cap: 2048))
15:07:16.569 [DEBUG] [rojector.internal.protocol.AckHandler] - channelRead0 { super={ messageType=06,data= } }
15:07:16.570 [DEBUG] [rnal.protocol.BlockingOutboundHandler] - setIdle called
15:07:16.571 [DEBUG] [rnal.protocol.BlockingOutboundHandler] - write(): Finished waiting on I/O State
15:07:16.571 [DEBUG] [.netty.handler.logging.LoggingHandler] - [id: 0xb19e73e2,L:/192.168.2.186:47010 - R:portserver1.tedworld.net/192.168.2.173:2102] READ COMPLETE
15:07:16.571 [DEBUG] [rnal.protocol.BlockingOutboundHandler] - write { super={ messageType=3F,data= } }
15:07:16.573 [DEBUG] [.netty.handler.logging.LoggingHandler] - [id: 0xb19e73e2,L:/192.168.2.186:47010 - R:portserver1.tedworld.net/192.168.2.173:2102] WRITE: 6B
+-------------------------------------------------+
| 0 1 2 3 4 5 6 7 8 9 a b c d e f |
+--------+-------------------------------------------------+----------------+
|00000000| 3f 89 01 50 57 0a |?..PW. |
+--------+-------------------------------------------------+----------------+
15:07:16.573 [DEBUG] [.netty.handler.logging.LoggingHandler] - [id: 0xb19e73e2,L:/192.168.2.186:47010 - R:portserver1.tedworld.net/192.168.2.173:2102] FLUSH
15:07:16.587 [DEBUG] [.netty.handler.logging.LoggingHandler] - [id: 0xb19e73e2,6B
+-------------------------------------------------+
| 0 1 2 3 4 5 6 7 8 9 a b c d e f |
+--------+-------------------------------------------------+----------------+
|00000000| 06 89 01 50 57 0a |...PW. |
+--------+-------------------------------------------------+----------------+
15:07:16.588 [DEBUG] [nternal.protocol.ResponseDecoderNetty] - decode DatagramPacket(/192.168.2.173:2102 => /192.168.2.186:47010,cap: 2048))
15:07:16.589 [DEBUG] [rojector.internal.protocol.AckHandler] - channelRead0 { super={ messageType=06,data= } }
15:07:16.590 [DEBUG] [rnal.protocol.BlockingOutboundHandler] - setIdle called
15:07:16.591 [DEBUG] [.netty.handler.logging.LoggingHandler] - [id: 0xb19e73e2,L:/192.168.2.186:47010 - R:portserver1.tedworld.net/192.168.2.173:2102] READ COMPLETE
15:07:16.592 [DEBUG] [.netty.handler.logging.LoggingHandler] - [id: 0xb19e73e2,widx: 7,7B
+-------------------------------------------------+
| 0 1 2 3 4 5 6 7 8 9 a b c d e f |
+--------+-------------------------------------------------+----------------+
|00000000| 40 89 01 50 57 30 0a |@..PW0. |
+--------+-------------------------------------------------+----------------+
15:07:16.593 [DEBUG] [nternal.protocol.ResponseDecoderNetty] - decode DatagramPacket(/192.168.2.173:2102 => /192.168.2.186:47010,cap: 2048))
15:07:16.594 [DEBUG] [.netty.channel.DefaultChannelPipeline] - Discarded inbound message { super={ messageType=40,data=30 } } that reached at the tail of the pipeline. Please check your pipeline configuration.
15:07:16.595 [DEBUG] [.netty.channel.DefaultChannelPipeline] - Discarded message pipeline : [logging,encode,decode,ack,blocking,DefaultChannelPipeline$TailContext#0]. Channel : [id: 0xb19e73e2,L:/192.168.2.186:47010 - R:portserver1.tedworld.net/192.168.2.173:2102].
15:07:16.596 [DEBUG] [.netty.handler.logging.LoggingHandler] - [id: 0xb19e73e2,L:/192.168.2.186:47010 - R:portserver1.tedworld.net/192.168.2.173:2102] READ COMPLETE