why would a process vary with the square root of wavelength?
with constant bandwidth (in octaves), receptive field area will vary with the square of wavelength (i.e. inversely with the square of frequency). the linear size (radius) of the r.f. will vary directly with wavelength. a process in volume would vary with the cube of wavelength. how do you go backwards from here?
okay, so the inhibitory inputs are all squared. i want the weights on these inputs to be proportional to the square root of filter wavelength. i could get a step closer by making the linear inputs proportional to wavelength before the squaring, which changes the question to:
why would a process vary with the inverse of spatial frequency? in my mind, the weights are still tied to the size of the r.f., so that the bigger it is, the more inhibitory connections it has. strictly speaking, this would make inhibition vary with the square of wavelength.
a bigger r.f. would have more inhibition, then. i am just making this up. so, an r.f. that's twice as big would have four times the inhibition. fine, but then why wouldn't it have four times the excitation? they would balance out. but maybe the excitation isn't balanced. maybe excitatory inputs are sparser and sparser for larger r.f.s. is that true?
if it's true, then effectively the gain for different wavelength r.f.s should increase with frequency, because the density of excitatory inputs should increase with frequency.
i feel like this is getting somewhere... atick and redlich, brady and field... somewhere in there...
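the candidate scaling laws above can be tabulated in a quick sketch (all numbers are arbitrary illustrations of the dimensional argument, not measurements or a model of any real circuit):

```python
import numpy as np

# Candidate scalings for a receptive field of wavelength lam,
# assuming constant octave bandwidth so linear size ~ lam.
lam = np.array([1.0, 2.0, 4.0, 8.0])   # filter wavelengths, arbitrary units

linear_size = lam            # r.f. radius scales directly with wavelength
area = lam ** 2              # 2-D r.f. area: square of wavelength
volume = lam ** 3            # a volumetric process: cube of wavelength
target_weight = np.sqrt(lam) # the puzzle: inhibitory weight ~ sqrt(wavelength)

# If inhibitory connections tile the r.f. area, total inhibition grows as
# lam**2; to land at sqrt(lam) overall, the effective weight per connection
# would have to fall off as lam**(-3/2):
per_connection = target_weight / area
print(per_connection)        # -> lam ** -1.5
```

nothing here resolves the question, it just makes explicit how steep the per-connection falloff would have to be if inhibition is counted per unit area.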
Monday, June 18, 2012
Saturday, June 16, 2012
stupid lazy
Alright, two posts in a row of me admonishing myself. Publicly, in theory. In theory, this is more embarrassing than it actually is.
It's Saturday evening. I have done nothing all day. Nothing. Played a computer game all morning. Read the front page of the WSJ. Ate a bowl of noodles and drank a pot of coffee. Played some piano. Looked at lots of funny gifs. Tried again to get Endnote properly installed on this stupid computer, and failed. X.0.2 + Office 2007 + Windows 7 = not work.
I have that SID manuscript open. I need to clean it up, add in those two other references I found but haven't really read because they look really dull. They're just 'relevant', in a parallel sense, but nothing obviously consequential. That's what led me into that stupid Endnote cul-de-sac again. I can do it remotely, so what. Just now I opened up the For Authors page on the journal site.
The CI paper is fine. Adding the MTF into the calculations didn't have as big of an effect as I expected, or hoped. Scaled filters are pretty resilient.
I haven't studied Chinese much in a while. I could be doing that.
No, no. God dammit. SID paper. Finish the goddam paper and upload it. There is no excuse. The paper is finished. Send it in. Dammit. I hate you.
Thursday, June 14, 2012
ohhh...
morning, myself. heart.
end. voice. nightingale?
This is just a diversion. There are things in life, every day, that we want to reach out and touch, or interact with, or follow, or watch, but we can't, because there are other things that we have to do instead. Other things that we should do instead. Self control can be suppression of the self, but sometimes it is just being rational, maintaining normal, keeping things the way you want them. Your mind is made up of many different parts which, on their own, are not as intelligent as you are. They don't have the same priorities as you. They don't even have the same memories as you - some of them have only existed for a few days, or months, or years. Maybe, some of them, you can remember when they came into existence. You can remember, because you are the one governing the rest, corralling them. You have to choose, at these instances, what to do - even if these things in life are like lures, and you see that between you and this other possibility, even just what ultimately would be a fleeting bit of soon-to-be-nothing, is a transparent membrane of a single impulse.
It's just a diversion. Maybe don't go back there. Maybe come back here, and see what you did, to keep from going there. Remember what there is in other places. Keep things level. Life is hard.
Wednesday, June 13, 2012
monitor MTF? sure!
Okay, so I need to know the spatial frequency transfer function of the monitor I've used to do most of my experiments over the past couple of years. I've never done this before, so I go around asking if anyone else has done it. I expected that at least B** would have measured a monitor MTF before, but he hadn't. I was surprised.
Still, B**'s lab has lots of nice tools, so I go in there to look around, and lo, T** is working on something using exactly what I need, a high speed photometer with a slit aperture. So today I borrowed it and set to work doing something I had never done and didn't know how to do. It was great fun.
D** helped me get the photometer head fixed in position. We strapped it with rubber bands to an adjustable headrest. I've started by just measuring along the raster. The slit is (T** says) 1mm, which is about 2.67 pixels on my display. I drifted (slowly) squarewave gratings with different wavelengths past the aperture - this was more complicated than it sounds. The monitor is run at 100Hz, and CRTs flash frames very rapidly, just a millisecond, so getting the photometer settings just right (it runs at 18kHz) took a bit of adjustment, as did figuring out good settings for the gratings and a slow-enough speed to drift them at (I'm limited by the 10-second block limit imposed by the photometer).
Anyways, I got back temporal waveforms which I treat as identical to the spatial waveforms. As expected, the power of these waveforms drops off as the gratings get finer. But, I know that it drops off too fast, because of the aperture. If the aperture were exactly 1 pixel across, and if it were aligned precisely with the raster, and if a bunch of other things were true, then I could know that each epoch recorded by the photometer reflected the luminance of a pixel, and my measurements would reflect the monitor MTF. But, like I said, the aperture is 1mm, so each 10ms epoch is an aliased average over >2 pixels. I'm not even thinking about the reflections from the photometer head (there's a metal rim to the aperture T** had taped on there).
My solution: code an ideal monitor, record from it with the same sized aperture, and divide it out of the measurements. I can then guess a blur function - Gaussian - and fit that to my (4) data points. That's what I did: here is my first estimate of the vertical MTF of my Dell p1130 Trinitron:
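The ideal-monitor step can be sketched like this (the pixel pitch, sample grid, and grating conditions here are my illustrative assumptions, not the actual setup):

```python
import numpy as np

# Sketch of dividing the slit aperture out of the measurements.
pix = 0.375                      # assumed mm per pixel (1 mm slit ~ 2.67 px)
slit_mm = 1.0                    # slit width
dx = pix / 8                     # simulate at 8 samples per pixel
x = np.arange(4096) * dx         # spatial axis along the raster, mm

def slit_modulation(wavelength_px):
    """Peak-to-trough modulation of an ideal squarewave grating, as seen
    through a boxcar aperture the width of the slit."""
    grating = np.sign(np.sin(2 * np.pi * x / (wavelength_px * pix)))
    n = int(round(slit_mm / dx))                 # slit covers ~21 samples
    seen = np.convolve(grating, np.ones(n) / n, mode='same')
    return seen.max() - seen.min()

wavelengths_px = np.array([2.0, 4.0, 8.0, 16.0])   # four grating conditions
ideal = np.array([slit_modulation(w) for w in wavelengths_px])
print(ideal)   # fine gratings attenuated hard by the aperture alone

# measured / ideal would divide the aperture out of the photometer data;
# a Gaussian blur (the unknown monitor MTF) is then fit to that ratio.
```

The point of the sketch: even a perfect monitor loses most of its modulation at 2-pixel wavelengths through a ~2.67-pixel slit, so the raw falloff badly overstates the monitor's blur until the aperture is divided out.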
The Nyquist limit for this display, at the distance modeled here, is about 23cpd, so I guess this Gaussian is in about the right place. It's hard to believe, though, because horizontal 1-pixel gratings look so sharp on this display. I feel like these must be underestimates of the transfer. I am nervous about how awful the vertical will be...
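That Nyquist figure checks out on the back of an envelope (the 0.375mm pixel pitch and ~1m viewing distance here are my assumptions for illustration, not numbers from the measurement):

```python
import numpy as np

# Display Nyquist limit from pixel pitch and viewing distance.
pitch_mm = 0.375       # assumed pixel pitch (1 mm slit ~ 2.67 px)
distance_mm = 1000.0   # assumed viewing distance

# Width of 1 degree of visual angle at that distance:
mm_per_degree = 2 * distance_mm * np.tan(np.radians(0.5))
pixels_per_degree = mm_per_degree / pitch_mm
nyquist_cpd = pixels_per_degree / 2    # finest grating: 2 pixels per cycle
print(round(nyquist_cpd, 1))           # ~23 cpd at these assumed numbers
```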
*edit*
It wasn't too bad, just a bit blurrier than the horizontal. Still makes me suspicious that I'm underestimating the horizontal. Not going to bother putting up plots, but here's my estimate of the pixel spread function (you can just see that it's a little broader left-right, that's the vertical blur):

