Hi, sorry for the late reply. In our timeline we should have something before December; I will let you know as soon as we have a version where this is solved!
Irie, so sorry for the late reply! I have reached out to our engineers again to get a proper answer, and I will get back to you as soon as possible.
Hi, thanks for reaching out. We have this overview of the URLs and what connections are needed:

Download Docker images:
- URL: registry-1.docker.io
- Protocols: HTTPS, TCP
- Ports: 443

Video stream:
- URLs: eu.robotics.cognitedata.com OR us.robotics.cognitedata.com
- Protocols: RTP, UDP
- Ports: 20000-40000, 5000-5050

Video signaling:
- URLs: eu.robotics.cognitedata.com
- Protocols: HTTPS, TCP
- Ports: 443

Cognite API:
- URLs: {cognite cluster}.cognitedata.com (example: api.cognitedata.com)
- Protocols: HTTPS, GRPC, TCP
- Ports: 443

Azure auth:
- URL: login.microsoftonline.com
- Protocols: HTTPS, TCP
- Ports: 443

The laptop using InRobot will also need to have the following whitelisted: video stream, video signaling, Cognite API, and Azure auth. I have also added an architecture drawing where you can see the interaction between InRobot, CDF, the video stream, and the robot. In addition, I would recommend using Firefox for debugging: if you visit the "about:webrtc" URL in Firefox, you get a WebRTC debugging tool.
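If it helps when auditing your firewall configuration, here is a minimal sketch (plain Python, nothing Cognite-specific; the service labels and the helper function are my own, not part of any SDK) that checks whether a given host/port pair falls inside the whitelist above:

```python
# Whitelist from the connection overview above, keyed by illustrative
# service labels. Ports are stored as inclusive (low, high) ranges.
ALLOWED = {
    "docker": {
        "hosts": {"registry-1.docker.io"},
        "ports": [(443, 443)],
    },
    "video_stream": {
        "hosts": {"eu.robotics.cognitedata.com", "us.robotics.cognitedata.com"},
        "ports": [(20000, 40000), (5000, 5050)],
    },
    "video_signaling": {
        "hosts": {"eu.robotics.cognitedata.com"},
        "ports": [(443, 443)],
    },
    "azure_auth": {
        "hosts": {"login.microsoftonline.com"},
        "ports": [(443, 443)],
    },
}


def is_allowed(service: str, host: str, port: int) -> bool:
    """Return True if (host, port) matches the whitelist entry for service."""
    rule = ALLOWED.get(service)
    if rule is None or host not in rule["hosts"]:
        return False
    return any(low <= port <= high for low, high in rule["ports"])
```

For example, `is_allowed("video_stream", "eu.robotics.cognitedata.com", 25000)` returns True, while the same host on port 4999 does not, since 4999 falls outside both video-stream ranges.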
Ishikawa, thank you for reaching out! This is a very good point. We are storing the direction in the waypoints, but we are currently not visualizing it. Just a quick question: is the direction most relevant when you want to choose a waypoint, or do you want to see directions for all waypoints? Regardless, we will have a look, and we will give you a timeline for when a solution could be available. Again, thank you for reaching out!
Ishikawa, thank you again for reaching out! Similarly, this will be fixed when we allow for mission configuration parameters such as walking speed, collision avoidance distance, etc. The timeline for this is after summer, so Q3. I will keep you updated on the timeline. The first release will let you set parameters on the robot; then, in the future, we will look into how specific missions can be configured with speed, collision avoidance, etc.
@Elias Treis, I believe you have also requested this?
Ishikawa, thank you for reaching out! We have received this feedback before, and we are looking into a way to configure mission parameters such as walking speed, collision avoidance distance, etc. The timeline for this is after summer, so Q3. I will keep you updated on the timeline.
I agree, AGVs will play an important role, and there are many commonalities between AGVs and the likes of drones and Spot. However, at this point, InRobot is focusing on using robots for capturing data that can be used for inspection, checks, reports, etc. I agree it is not all of APM, but it is the part we are focusing on, so a lot of the UI would not make sense for an AGV. That being said, we have a robotics API that InRobot is using, where streaming of positions in 3D, route planning, maps, and defining lists of tasks can be done.
Hi @ibrahim.alsyed, thanks for reaching out! We have not integrated with any AGV systems to date; InRobot is currently focusing on the Asset Performance Management space, especially around field operations. We have an API and infrastructure that could be utilized, but I am not sure this would be the best fit, to be honest.