Wasn't really sure where else to turn, but I did remember someone here being able to help with PHP and MySQL.
Anyway, I think I am definitely in need of some programmers' help - I'm having trouble figuring out the algorithm for this.
Assume the following:
a. Disk spooling is NOT being used.
b. The printer does NOT have a hardware buffer to hold the output while the printer is printing.
(The theme music from Mission Impossible is playing faintly in the background).
SIMULATE the following scenario
A hypothetical program computes for three seconds then outputs a variable length record to be printed. The printer takes from 0.75 to 4.75 seconds (average time is 2.75 seconds) to print each output record. (Use a random number generator)
The hypothetical program loops 500 times for each case of software buffers (0, 1, 2, 3, 4, 5, 10, 25, and 100 software output buffers). Calculate the AVERAGE time for the program to “virtually compute and print” a record from the 500 records, for EACH of the 9 choices of buffer. Plot the results (by hand is OK). The Y axis is from zero to 8 seconds, and the X axis is nonlinear and displays all nine cases of buffers.
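Since the spec leans on that 2.75-second average, it's worth sanity-checking the generator before trusting the simulation numbers. A uniform draw on [0.75, 4.75] has its mean at the midpoint, (0.75 + 4.75) / 2 = 2.75, and a quick check (seed and sample count are my own choices, not part of the assignment) confirms it:

```python
import random

# The assignment's print time is uniform on [0.75 s, 4.75 s];
# a uniform distribution's mean is the midpoint: (0.75 + 4.75) / 2 = 2.75 s.
rng = random.Random(42)  # fixed seed so the check is repeatable
samples = [rng.uniform(0.75, 4.75) for _ in range(100_000)]
mean = sum(samples) / len(samples)
print(round(mean, 3))    # lands very close to 2.75
```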
Thus far, my code in Python:
import random
import queue  # Python 3 module name (the Python 2 name was Queue)

q = queue.Queue()

print("This program simulates a printer printing records while dealing with interrupts over 9 different buffers.")
print("Number of Loops: 500")
print("Number of Buffers: 9")
print("Record Generation Time: 3")
print("Minimum/Maximum print time: 0.75/4.75")

def main(q):
    buffers = [0, 1, 2, 3, 4, 5, 10, 25, 100]
    for buffer in buffers:
        total = 0                     # 'sum' shadows the built-in, so renamed
        for i in range(500):          # range(1, 500) only looped 499 times
            q.put(i)
            time = random.uniform(0.75, 4.75) + 3
            total = total + time
        average = total / 500
        while not q.empty():          # drain the queue between buffer cases
            q.get()
        # note: 'buffer' never enters the timing, so every case computes
        # the same unbuffered (buffer 0) average
        print("For Buffer", buffer, ", the average time is", average, ".")

main(q)
Buffer 0 works correctly - however, the rest don't. The average time for the 100-buffer case is around 3.25 secs or so (low 3's), while buffer 0 is about 5.75.
I can't think of a way to account for the buffer times - I think it might involve something like average = (sum - (buffers * 3)) / 500.
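One way to account for the buffers - a minimal sketch, not the assignment's required method - is to track when each buffered record will finish printing, and only block the program's clock when every buffer is full (or, with zero buffers, on every record). This assumes hand-off to a free buffer is instantaneous and the printer drains buffers one record at a time in FIFO order; `simulate` and its parameters are my own names:

```python
from collections import deque
import random

def simulate(n_buffers, n_records=500, compute=3.0, seed=None):
    rng = random.Random(seed)
    t = 0.0            # the program's clock
    prev_finish = 0.0  # when the printer finishes its most recent record
    pending = deque()  # finish times of records still occupying buffers
    for _ in range(n_records):
        t += compute                        # compute one record (3 s)
        while pending and pending[0] <= t:  # printed records free their buffers
            pending.popleft()
        if n_buffers == 0:
            # no buffer: the program itself must wait out the print
            prev_finish = max(t, prev_finish) + rng.uniform(0.75, 4.75)
            t = prev_finish
        else:
            if len(pending) >= n_buffers:
                # every buffer full: block until the oldest record prints
                t = pending.popleft()
            # printer starts this record when it finishes the previous one
            prev_finish = max(t, prev_finish) + rng.uniform(0.75, 4.75)
            pending.append(prev_finish)
    return t / n_records  # average "compute and print" time per record

for n in [0, 1, 2, 3, 4, 5, 10, 25, 100]:
    print(n, round(simulate(n, seed=1), 2))
```

With this model, buffer 0 comes out near 3 + 2.75 = 5.75 (the program waits for every print), and the large-buffer cases approach 3 (the program almost never waits, since the printer's 2.75 s average keeps up with the 3 s compute time), which matches the shape of the numbers you describe.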
Any help is appreciated! (I know most people are gonna be like 'Wtf!' Obviously I'm not asking you then! lol)