VerneMQ vmq_bridge feature #2178
Comments
Please check the log files on both sides. Is the bridge able to connect? (Is A reachable at the network level from B?) 👉 Thank you for supporting VerneMQ: https://github.com/sponsors/vernemq
Hello @ioolkos, many thanks for the comment. This is the log from machine B: 2023-08-03 14:41:52 09:11:52.968 [info] Bridge br0 connected to IpAddress:1883.
This should work, yes.
Yes, I have verified that the username and password on the B end are correct. There is no problem there: I can connect with that username and password on Machine B, I just cannot see the Machine A messages. Any suggestions?
Hi @ioolkos, somehow I found the problem; this is what caused it for me:
listener.tcp.default = 0.0.0.0:1883
listener.ssl.default = 0.0.0.0:8883
When I configured the above values in the .conf file, it did not work at all. But when I changed it to the following, it worked:
listener.tcp.name = 127.0.0.1:1883
However, I want to keep the configuration above.
@ankitshah197 this doesn't look like a Verne issue to me. Maybe some connectivity thing on the Docker level?
Hi @ioolkos, it seems like a problem with the mountpoint. When I removed some of the lines below, it started working:
listener.tcp.default = 0.0.0.0:1883 ******** Removed *********************
listener.ssl.default = 0.0.0.0:8883 ******** Removed *********************
After that I checked #205, and it seems the bridge plugin currently only supports the default mountpoint "", so the messages are never picked up by the bridge. Do you have any update on this? That issue is from 2016, and a lot could have changed since then.
@ankitshah197 apologies, I didn't catch that you configured mountpoints. In Verne, mountpoints separate message traffic.
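To illustrate the mountpoint separation described above, here is a hedged sketch of a VerneMQ listener configuration. The listener name "isolated" and the mountpoint name "tenant1" are made up for illustration; the exact option names should be checked against the VerneMQ documentation:

```
# Hypothetical sketch: a listener assigned an explicit mountpoint.
# Clients connecting on port 1884 publish and subscribe inside the
# "tenant1" namespace and are isolated from the default "" mountpoint,
# which is where the bridge plugin delivers its messages (per #205).
listener.tcp.isolated = 0.0.0.0:1884
listener.tcp.isolated.mountpoint = tenant1

# A listener without a mountpoint option uses the default mountpoint "",
# so clients connecting here can see bridged messages.
listener.tcp.default = 0.0.0.0:1883
```

Under this assumption, messages arriving over the bridge land in the "" mountpoint and would never reach clients connected to the "tenant1" listener, which matches the behaviour reported above.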
Yes, that should be the expected behaviour, but that's not what is happening.
Must look into it. Thanks for verifying and reporting this!
Requirement
Forward all messages from Machine A to Machine B. I have configured the bridge on Machine B. Below is my .conf file:
plugins.vmq_bridge = on
vmq_bridge.tcp.br0 = "some address due to privacy":1883
vmq_bridge.tcp.br0.cleansession = on
vmq_bridge.tcp.br0.client_id = BridgeMM
vmq_bridge.tcp.br0.keepalive_interval = 60
vmq_bridge.tcp.br0.username = user
vmq_bridge.tcp.br0.password = password
vmq_bridge.tcp.br0.topic.1 = * in
vmq_bridge.tcp.br0.restart_timeout = 10
vmq_bridge.tcp.br0.try_private = off
vmq_bridge.tcp.br0.max_outgoing_buffered_messages = 1000
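One detail worth checking in the topic line above: in MQTT, the multi-level wildcard is `#`, not `*` (`*` has no wildcard meaning in MQTT topic filters). Assuming VerneMQ's bridge follows the common `pattern direction [qos]` topic syntax, a subscription to all remote topics would be sketched as:

```
# Hypothetical sketch, assuming a "<pattern> <direction> [<qos>]" topic syntax.
# '#' is the MQTT multi-level wildcard; the QoS level 0 is illustrative.
vmq_bridge.tcp.br0.topic.1 = # in 0
```

If the `*` pattern is not treated as a wildcard by the bridge, no remote topics would match, which could also explain why no messages appear on Machine B.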
Expected behavior
When I connect to Machine B (mqtt://Ipaddress:1883), I can see all of Machine A's messages on Machine B.
Actual behaviour
I can see all the messages of Machine A, with username "user" and password "password", when connecting to it via MQTT Explorer. But I cannot see any messages on Machine B.