Sunday, June 29, 2008
I moved into a new apartment, with my old roommate Bob and his best friend Kyle. It seems like a good setup so far. It's a really nice place in the Richmond district - all hardwood floors, a fireplace, lots of space, half a block from Golden Gate Park and a little over a mile from the ocean. I've been running outside again, on trails! I'd forgotten how much I miss that. We got a good deal on the place, too. (God bless rent control...)
I took a week off (which was not really relaxing at all, due to the move and to several other factors), and picked my summer lab rotation: I'll be working in a synthetic biology lab that does work on 4th-generation biofuel production. The PI outlined my project with me on Thursday. I had explained to him that I had a background in physics, and that was where my real technical interests lay, so he proposed I work on building a dynamic model connecting the current guess-and-check transfer functions of the quorum-sensing bacterial NOR logic gates with the statistical physics-based RBS (ribosome binding site) calculator. Both the logic gates and the RBS calculator are already developed; my job is to connect the two with a kinetic mathematical model. I'm doing some reading on signal processing at the moment, trying to come up with a general strategy. I'm excited about this, though; this is the sort of work I consider myself good at, so I'm hopeful that I can create something genuinely useful here. The big picture is that a good kinetic framework for transfer function prediction, one that goes back to the statistical mechanics of ribosome-DNA binding, would allow more efficient, logical genetic circuit design, which in turn would help streamline the biofuel production work.
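Just to get my own head around it, here's a toy sketch of the kind of steady-state transfer function I'm imagining: each input represses the output promoter via a Hill function, and the maximal expression rate gets scaled by a translation initiation rate of the sort an RBS calculator would predict. Every parameter name and number below is a placeholder I made up, not anything from the lab:
# toy NOR-gate transfer function sketch (all parameters are made up)
def nor_transfer(r1, r2, beta=1.0, k1=0.5, k2=0.5, n=2.0, rbs_strength=1.0):
    # fraction of time the promoter is free of both repressors (Hill-type repression)
    repression = 1.0 / ((1.0 + (r1 / k1) ** n) * (1.0 + (r2 / k2) ** n))
    # steady-state output: transcription (beta * repression) scaled by RBS strength
    return beta * rbs_strength * repression
# output is high only when both inputs are low -- NOR-like behavior
for r1 in (0.0, 1.0):
    for r2 in (0.0, 1.0):
        print r1, r2, nor_transfer(r1, r2)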
I've decided to sell my car and buy a motorcycle. (Right now I am negotiating with a guy down in the south bay for an almost-new Suzuki DR650, which is pretty much my ideal on-road/off-road bike.) I miss my old Nighthawk S from college, and owning a car in San Francisco just isn't practical. Plus, with gas prices the way they are, I think I can get a good deal for my Civic. With any luck, I should have a significant chunk of money left over, too, which, along with my savings, should give me enough to complete my private pilot's license and still have some money set aside for a rainy day. I looked into pilot training, and as far as I can tell, it looks like it'll be out of either San Mateo or (more likely) San Carlos airport. I'm really looking forward to starting. Feels like I've been saving up for this forever, and I'm excited about finally getting to work on it!
Thursday, June 12, 2008
Training
I'm attempting to teach myself some Python. As a (particularly useless) exercise, I thought I'd try to write a perceptron neural net. I think I've got it working! As far as I can tell this program works for any linearly separable goal. I'm sort of proud of it, so I thought I'd post the listing here:
# perceptron.py: single-neuron perceptron neural network
# (c) Pericles v. 2.0, 6/12/2008
import sys
import random
input = [[0, 0], [0, 1], [1, 0], [1, 1]]
# randomize both state vector elements on [0, 1)
state = [random.random(), random.random()]
# randomize the bias value on [0, 1)
bias = random.random()
# initial output list
output = [0, 0, 0, 0]
# desired output list: simple OR (this program can learn any
# linearly separable set)
goal = [0, 1, 1, 1]
# hard limit evaluation function: if dot product of the state and input
# vectors is greater than or equal to the bias, then neuron fires
def eval(row, col, bias):
    product = 0
    for i in range(len(row)):
        product += row[i] * col[i]
    if product >= bias:
        fire = 1
    else:
        fire = 0
    return fire
# transition function: compare the binary values of the current output
# element to the goal element, and adjust the state vector if different
def transition(state, input, output, goal, bias):
    b = bias
    s = state
    # for every output element that is not equal to its corresponding goal
    # element, adjust the state vector and the bias
    if output != goal:
        for i in range(len(s)):
            s[i] += (goal - output) * input[i]
        b -= goal - output
    return s, b
# main
print "Simple OR:", goal
print "----------------------------"
print "Output State Bias"
print "----------------------------"
for j in range(100):
    # iterate the 4 input vectors through the evaluation function
    for i in range(len(input)):
        output[i] = eval(state, input[i], bias)
    # if the output vector has reached the goal vector, then quit
    if output == goal:
        print output, " ", state, " ", bias
        print "Training complete!"
        sys.exit()
    # iterate the output values through the transition function, and
    # retrieve the adjusted values for the state vector and the bias
    for i in range(len(output)):
        state, bias = transition(state, input[i], output[i], goal[i], bias)
    print output, " ", state, " ", bias
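For what it's worth, the update in transition() is just the standard perceptron learning rule: each weight gets (goal - output) * input added to it, and the bias gets (goal - output) subtracted. From what I've read, that rule is guaranteed to converge as long as the goal really is linearly separable.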
Tuesday, June 10, 2008
Epoch
Spring quarter has drawn to a close: my rotation ends this Friday, and my classes are done. I have a take-home final to complete, and then my first year of graduate school will be over.
I think I will do a 4th rotation this summer, in a lab that does work related to biofuels. I've always been interested in the aging process, but I think so long as there is an important big picture behind my work, I will be satisfied with my research. Aging is certainly a relevant and important problem, but so is the energy problem, after all.
I'm planning on taking a couple of weeks of downtime before I start, however, to rest and to teach myself a few things that have come up over and over again this year that I have little to no background in. First up: statistics! I'm armed with a fairly rigorous mathematical statistics/data analysis book, and I'm going to bull through this thing before I take another step scientifically. I'm also hip-deep in learning Python, the other thing I'm dead set on becoming competent with before I jump into my next (and possibly final?) lab.
Also, I've made up my mind about the whole last name thing -- I think I'm really going to do it! I talked to my folks about it, and everybody was super cool with it, surprisingly. Dad said that he liked having a unique last name, but the whole unpronounceable-ness aspect of it kinda bugged him, too. Anyway, I decided that Peterson would be a pretty good choice. I like the sound of it, and it's easy to pronounce. Dad liked it, too. You know, I've never liked the naming convention in our culture, because our names don't mean anything. If you think about common names like Smith or Carpenter or whatever, people took on those names because they were descriptive -- they actually were blacksmiths and carpenters. So the name Peterson appeals to me on this level, because it's at least descriptive -- I'm Peter's son. It's not Greek, but I figure, I'm a quarter Greek, and a quarter mixed northern European, so I don't feel like I'm misrepresenting my ancestry. The only downside is that it is sort of common, which isn't the greatest thing, I guess, but then, I figure, if my uniqueness as a person is tied to having a super hard to pronounce last name, I probably fail at life anyway!