robtarget (robot target) is used to define the position of the robot and external axes.
If we look at the home target above, there are 4 different components. I will explain each of them.
1. [785.989150989,33.049684324,1127.381297276] —- this is called the translation, abbreviated trans. It gives the position of the tool center point expressed in mm, in the order [x, y, z].
The position is specified in relation to the current object coordinate system, including program displacement. If no work object is specified then this is the world coordinate system.
2. [0.063923372,-0.907835521,0.01028962, -0.414297711] —- this value is called the rotation, or orientation, of the tool. It is expressed as [q1, q2, q3, q4], a quaternion, which consists of the four components q1, q2, q3, and q4.
The orientation is specified in relation to the current object coordinate system including program displacement. If no work object is specified then this is the world coordinate system.
The orientation must be normalized; that is, the sum of the squares of its components must equal 1: q1² + q2² + q3² + q4² = 1
The orientation of a coordinate system (such as that of a tool) is described by a rotational matrix that describes the direction of the axes of the coordinate system in relation to a reference system (see the following figure)
3. [0,0,-3,0] —- This represents the axis configuration of the robot (cf1, cf4, cf6, and cfx), defined as the current quarter revolution of axis 1, axis 4, and axis 6. The quadrant number is tied to the current joint angle of the axis. For each axis, quadrant 0 is the first quarter revolution, 0 to 90°, in the positive direction from the zero position; quadrant 1 is the next quarter revolution, 90 to 180°, and so on. Quadrant -1 is the revolution 0° to (-90°), and so on. The figure shows the configuration quadrants for axis 6.
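As a quick illustration, the quadrant number for a given joint angle can be computed by flooring the angle divided by 90°. This is a small sketch of the rule described above; the function name is mine, not anything from RobotStudio:

```python
import math

def quadrant(angle_deg):
    """Configuration quadrant for a joint angle in degrees:
    0 to 90 -> 0, 90 to 180 -> 1, 0 to -90 -> -1, and so on."""
    return int(math.floor(angle_deg / 90.0))

print(quadrant(45))    # 0: first quarter revolution
print(quadrant(135))   # 1
print(quadrant(-45))   # -1
```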
4. [9E+09,9E+09,9E+09,9E+09,9E+09,9E+09] — this represents the external axes. If an axis is rotational, its value is given in degrees, whereas if it is linear, it is given in mm.
The value 9E9 is used for axes that are not connected. If the axes defined in the position data differ from the axes actually connected at program execution, the following applies: if a position is not defined in the position data (value 9E9), the value is ignored; if a position is defined in the position data although the axis is not connected, the value is also ignored.
In the value above, all entries are 9E9, which means they are undefined. As the main figure at the beginning shows, we have not used any external axis values.
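The normalization condition from earlier is easy to check in Python. A minimal sketch using the quaternion from the home target above (the variable names are mine):

```python
import math

# orientation quaternion [q1, q2, q3, q4] from the robtarget above
q = [0.063923372, -0.907835521, 0.01028962, -0.414297711]

# sum of squares must equal 1 for a valid orientation
norm_sq = sum(c * c for c in q)
print(norm_sq)

# if a hand-edited quaternion drifts from unit length, renormalize it before use
q_normalized = [c / math.sqrt(norm_sq) for c in q]
```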
Hi, in this blog post I am writing about how to set up a work task in RobotStudio from start to finish and actually run it.
Open RobotStudio (for this instance we have an ABB IRB 1600 10 kg, 1.45 m robot arm).
Import all the 3D objects needed for the simulation. The furniture arrangement in the simulation should be exactly the same as in the real environment.
Import the gripper that the project is using and attach it to the robot in RobotStudio.
The RobotStudio environment looks like the image below,
furniture arrangement and robot alignment for the simulation.
Any input/output signals that are needed should be created here. If we move an object that must return to its original position for the next cycle, a positioner should also be set up here.
We need to make target points to define the path. We can either enter the x, y, z values in the box that RobotStudio provides, click where we want to place the target, or manually jog the virtual robot arm and use the Teach Target function. All 3 options are shown below respectively.
Option 1: left-click when you see the symbol shown by the arrow at the required spot. For the teach option: click no. 1, then jog the robot and select Teach Target to create a target.
We need to create all the targets the robot has to follow, and also check the angle of each target. The angle is very important because it especially guides axes 4, 5, and 6 and determines how easy or hard the path is. The angle is also a major cause of configuration errors, which make the programming complex, so we need to check each target carefully.
In my case, after creating all the required targets it looks like the picture below.
all the created targets that make up the path
After finalizing the targets, we can make the path; generally we use MoveL or MoveJ.
MoveL is used to move the tool center point (TCP) linearly to a given destination. When the TCP is to remain stationary, this instruction can also be used to reorient the tool.
MoveJ is used to move the robot quickly from one point to another when that movement does not have to be in a straight line. The robot and external axes move to the destination position along a non-linear path. All axes reach the destination position at the same time.
The final generated path looks like the picture below.
We can now run the simulation if everything goes well. Likely problems are path configuration errors, the robot being out of reach, etc. These occur at specific targets and need to be addressed one by one.
The final simulation video can be watched below,
The computer needs to be connected to the IRC5 Compact controller using an Ethernet cable: one end goes into the computer's Ethernet connector, and the other end into the service port of the IRC5 Compact controller.
Then we need to add the controller with One Click Connect, which connects the robot to RobotStudio.
The RAPID code needs to be copied to the real robot controller, which must be done while the robot is in manual mode. After finishing the upload, we change the controller to automatic mode and play the program on the real robot.
The velocity needs to be changed to a practical value; if the velocity is too high, the robot might shake violently and stop the run.
A video of the whole process will be made and uploaded soon.
We collect data using sensors. (The sensors are ready, but we still need to deploy them and collect the data; for now we are using dummy data.)
We have already fixed the sensor placement locations, which we will use.
The data will be visualized in the form of cubes and cylinders. The final product will look like the image below.
In the first piece of code below, I first read the CSV and then make a rectangle at each given sensor value point.
It worked the first time, but when I tried again it would not show the rectangles.
** I solved this problem this morning: I had forgotten to close the file after opening it. The fix is included in the code below.
import Rhino
import Rhino.Geometry as rg
import scriptcontext as sc
import System.Drawing
import rhinoscriptsyntax as rs
import csv

def readFile():
    f = open('05 28.csv', 'r')
    rd = csv.reader(f)
    ptlist = []
    slist1 = []
    slist2 = []
    slist3 = []
    slist4 = []
    for line in rd:
        x = float(line[0])
        y = float(line[1])
        z = float(line[2])
        sensor1 = float(line[4])
        sensor2 = float(line[5]) * 1000    # convert to float before scaling
        sensor3 = float(line[6]) * 10000   # (float(line[6]*10000) would repeat the string)
        sensor4 = float(line[15]) * 10000
        pt = (x, y, z - 165000)
        s1 = (x, y, z + sensor1 * 10000)
        s2 = (x, y, z + sensor2)
        s3 = (x, y, z + sensor3)
        s4 = (x, y, z + sensor4)
        ptlist.append(pt)
        slist1.append(s1)
        slist2.append(s2)
        slist3.append(s3)
        slist4.append(s4)
    f.close()    # the fix: an early "return rd" used to sit here instead of closing the file
    print(ptlist)
    sensor1 = rs.AddPointCloud(slist1)
    for point in slist1:
        x = 10000
        y = 10000
        point1 = ((point[0] - x / 2), (point[1] - y / 2), point[2])
        rs.AddRectangle(point1, x, y)
    return slist1

if __name__ == "__main__":
    #AddMaterial()
    #MakeMaterial()
    readFile()
Then I decided to try using boxes. I thought I could extrude the rectangle in the z direction to make a box, which I found impossible. I did as taught in various sources on the internet, but when I try to make the box it shows this error.
The code is the following:
import Rhino
import rhinoscriptsyntax as rs

# box center (x, y), base height z, and box dimensions
x = 652356
y = 1236540
z = 167550
l = 10000   # length
b = 10000   # breadth
h = 500     # height

# eight corner points: bottom face first, then the top face in the same order
pt1 = (x - 0.5 * l, y - 0.5 * b, z)
pt2 = (x - 0.5 * l, y + 0.5 * b, z)
pt3 = (x + 0.5 * l, y + 0.5 * b, z)
pt4 = (x + 0.5 * l, y - 0.5 * b, z)
pt5 = (x - 0.5 * l, y - 0.5 * b, z + h)
pt6 = (x - 0.5 * l, y + 0.5 * b, z + h)
pt7 = (x + 0.5 * l, y + 0.5 * b, z + h)
pt8 = (x + 0.5 * l, y - 0.5 * b, z + h)   # the original line had an unmatched "((" here
rs.AddBox([pt1, pt2, pt3, pt4, pt5, pt6, pt7, pt8])
As I planned, this should create a box at the point, but it didn't work.
I am currently working on the robotic arm wall construction research project.
First I did the simulation of wood strut wall construction manually.
I generated the path by taking the points for each strut.
This took a lot of time.
Here I will first explain how not to work, and then I will explain how to work.
But the key thing to know is: “DON’T USE YOUR ENERGY TO DO WORK THAT A COMPUTER CAN DO; MAKE THE COMPUTER DO THAT WORK.”
1. HOW NOT TO GENERATE THE PATH.
The video below was generated by my manual path making,
Here is another video where I make the wood strut wall manually. This took me 11 continuous hours, all done by hand, which I shouldn't have done.
Here, I first upload the wall geometry file that I created in Rhino in the .sat format,
which removes all the lines and points and only lets me use the geometry.
Then I create a new geometry component for each piece so that I can show and hide a particular geometry.
The whole process of making the wall manually is in the video below.
2. AUTOMATED PATH GENERATION PROCESS.
Here we first make the path in Rhino, using RhinoPython.
The Python code attached at the end will generate the path as a single polyline.
The line is then exported as a .dxf file, which allows the lines to be imported into RobotStudio.
The line is then passed to the automatic path making.
The path automatically generated by RobotStudio has a problem with the direction of the targets.
So I aligned all the path targets with one target that I generated manually with the required direction.
Most of the path worked after that. Some targets might still be out of reach, which needs to be fixed by selecting fewer walls at a time in Rhino, since the robot cannot reach those targets; alternatively, we can adjust the robot position so that it can reach all the targets.
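The idea of building the path as an ordered list of via points can be sketched as follows. This is a minimal illustration with made-up strut end points and a lift height I chose for the example; inside Rhino, rs.AddPolyline(path) would then create the single polyline to export as .dxf:

```python
# each strut is a pair of end points (made-up values); the path visits them
# in build order, lifting between struts for a safe approach and retract
struts = [((0, 0, 0), (500, 0, 0)),
          ((0, 100, 0), (500, 100, 0))]
lift = 350  # mm above the strut (assumed value)

path = []
for start, end in struts:
    path.append((start[0], start[1], start[2] + lift))  # approach above start
    path.append(start)                                  # down to the strut
    path.append(end)                                    # travel along the strut
    path.append((end[0], end[1], end[2] + lift))        # retract above end

print(len(path))   # 8 via points for 2 struts
# inside Rhino: rs.AddPolyline(path), then export the curve as .dxf
```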
The brick pattern with the frame is constructed first.
After that, the center points of the brick wall are generated as point 1.
Point 2 is 350 mm above each of these.
reference points
Likewise, I create 2 more points for the brick source and the middle point.
I need a separate polyline for each path through the points, but I couldn't do it: when I make the polyline, it connects all of the points, creating unneeded polyline segments.
But I can create the required sets of lines from one point to another; then I have 2 segments of polyline.
If I cannot make a single polyline, I will join the lines manually and make a robot path. Then I will have to create the RAPID code for those paths.
As in the figure above, I want to create a polyline connecting points 1, 2, 3, 4, but the lines should connect row by row.
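One possible way to get one polyline per row, instead of a single tangled one, is to group the points by their row coordinate before creating each polyline. A sketch with made-up points, where I assume points on the same row share a y value; only the grouping logic runs outside Rhino:

```python
# made-up via points (x, y, z); points on the same row share a y value
points = [(0, 0, 0), (350, 0, 0), (700, 0, 0),
          (0, 250, 0), (350, 250, 0), (700, 250, 0)]

# group by row (the y value), keeping x order inside each row
rows = {}
for pt in sorted(points):
    rows.setdefault(pt[1], []).append(pt)

for y, row_points in sorted(rows.items()):
    print(y, len(row_points))
    # inside Rhino: rs.AddPolyline(row_points) -> one polyline per row
```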