wajahat
Advanced Member level 4
python com port programming
Hi all,
I am doing socket programming in Python. What my code does (I guess) is send two UDP packets back to back to a relay server. The server sends these packets back to my machine immediately. After receiving the first packet I note the time using 'datetime.datetime.now()'; similarly, I note the time for the second packet.
There should be a delay of at least on the order of 1 microsecond between these two packets, but what my code gives me is zero microseconds. Something is wrong somewhere and I can't figure out what.
The code is as follows:
import socket
import datetime
# creating datagram socket
datagram=socket.socket(socket.AF_INET,socket.SOCK_DGRAM)
HOSTNAME='129.187.223.200'
PORTNO=2000
packet_1='a'
packet_2='b'
time_difference=[]
# creation of packets of size 600
for j in range(1,600):
    packet_1=packet_1+'a' #creation of packet 1
    packet_2=packet_2+'b' #creation of packet 2
time_difference=[]
#sending packets 200 times
for i in range(1,200):
    # sending packet 1
    datagram.sendto(packet_1,(HOSTNAME,PORTNO))
    # sending packet 2
    datagram.sendto(packet_2,(HOSTNAME,PORTNO))
    # receiving packet 1
    datagram.recvfrom(600)
    # reception time of packet 1
    first_packet_arrival=datetime.datetime.now()
    # receiving packet 2
    datagram.recvfrom(600)
    # reception time of packet 2
    second_packet_arrival=datetime.datetime.now()
    # calculating the difference in reception times in secs
    difference_in_time=second_packet_arrival-first_packet_arrival
    x=float(difference_in_time.microseconds)
    time_difference.append(x/10**6)
print time_difference
------------------------------------------------------------------------------------
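As far as I understand the timedelta arithmetic itself, the conversion to seconds should behave as I expect. Here is a minimal standalone sketch with made-up timestamps (not part of the actual test, just the same arithmetic in isolation):

import datetime

# two made-up arrival times, 250 microseconds apart
first_packet_arrival=datetime.datetime(2011,1,1,12,0,0,100)
second_packet_arrival=datetime.datetime(2011,1,1,12,0,0,350)
# same arithmetic as in the loop above
difference_in_time=second_packet_arrival-first_packet_arrival
x=float(difference_in_time.microseconds)
print x/10**6   # prints 0.00025, i.e. 250 microseconds

So when the two timestamps actually differ, the value appended to the list is non-zero; in the real run I only ever get 0.0.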
Please help me figure out what's wrong.
Thanking you in anticipation.
Regards,
wajahat hussain