SED Technology

Tahir2

A recap of what is happening with SED is in order, I believe, so here goes:

Canon and Toshiba showcased SED technology on 12/01/2006 (January 12th for you guys across the pond).


The product: It's not officially a product yet, meaning Toshiba didn't announce final pricing or availability, but on paper, SED, co-developed with camera maker Canon, looks promising. A flat-panel display technology, SED (Surface-conduction Electron-emitter Display) uses phosphors activated by an electron emitter, just like standard CRT tube televisions. Supposedly, the result is tube-level picture quality in a flat form factor. Details were sketchy, but the first model should be 50 inches in size and have full 1,920x1,080 resolution.

Canon and Toshiba formed a joint SED TV display company on 15/09/2004 called SED Inc.


Imaging giants Canon and Toshiba said Wednesday they are joining forces to create a separate company to focus on SED flat-panel displays. The new company, SED Inc., is slated to begin operation next month, Canon and Toshiba said. It will be based in Japan and employ about 300 people.


There was a previous showcase of SED TVs on 23/08/2005:

SED technology has been under development for more than 20 years and is being positioned by Canon and Toshiba as a better option for large-screen TVs than PDP (plasma display panel) technology. SED panels can produce pictures that are as bright as CRTs (cathode ray tubes), use up to one-third less power than equivalent size PDPs and don't have the slight time delay sometimes seen with some other flat-panel displays, according to the companies.

Canon have been researching SED since 1986 and began joint development with Toshiba in 1999:


Canon began research in the field of SED technology in 1986 and, in 1999, began joint development activities with Toshiba with the aim of commercializing an SED product. In light of the progress realized at this stage of the joint development process, Canon and Toshiba, deeming the timing appropriate, agreed upon the establishment of a joint venture. Plans for the new company call for the commercialization of SED panels primarily for large-screen flat-panel televisions, with production scheduled to begin in 2005. Following the initial launch, a mass-production factory will be readied and production volume will be increased.
Canon's technology primer on SED

The key to the electron emitters, at the heart of the SED, is an extremely narrow slit several nanometers wide between two electrodes. Electrons are emitted from one side of the slit when approximately 10V of electricity is applied. Some of these electrons are scattered at the other side of the slit and accelerated by the voltage (approximately 10 kV) applied between the glass substrates, causing light to be emitted when they collide with the phosphor-coated glass plate.
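To put those figures in perspective, here's a rough back-of-the-envelope calculation (my own illustration, not from Canon's primer) of what a 10 kV accelerating voltage does to an electron:

Code:
# Rough numbers for the primer above (my own sketch, not Canon's):
# energy and speed of an electron accelerated through a ~10 kV gap
# between the emitter and the phosphor-coated faceplate.
E_CHARGE = 1.602e-19  # electron charge, C
E_MASS = 9.109e-31    # electron rest mass, kg
C = 2.998e8           # speed of light, m/s

def electron_after(voltage):
    """Return (energy in keV, classical speed in m/s) after acceleration."""
    energy_j = E_CHARGE * voltage           # work done by the field, W = qV
    speed = (2 * energy_j / E_MASS) ** 0.5  # classical approximation
    return voltage / 1e3, speed            # energy in eV equals voltage numerically

kev, v = electron_after(10_000)
print(f"{kev:.0f} keV, {v:.2e} m/s ({v / C:.0%} of c)")
# -> 10 keV, 5.93e+07 m/s (20% of c)

At roughly 20% of the speed of light the classical formula is still within a couple of percent of the relativistic answer, so it's fine for a ballpark figure.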

Edit: added quote
 
If it forms static magnetic fields around the screen/between screen and user, and/or emits X-rays (high probability, since it relies on accelerating electrons), I'm not buying it and I don't think anyone else should either.

If it requires high refresh rates to avoid image flicker, I'm not buying it EITHER. Crappy tech flickers. Good tech (LCD) does not.

And what about burn-in? It still uses phosphors. And what about weight? It's going to need to be fairly thick methinks if it's containing a vacuum.
 
SED is very promising but the lack of news and poor showing at CES smells of pre-production difficulties.

I think LED-backlit LCD will make a dent in the high end of the market before SEDs become common. They are inherently more expensive to produce, but they're based on existing LCD tech and 3rd-party IP, which should keep initial costs relatively low - if NEC can bring out a first-of-a-kind high-res LCD/LED monitor that costs $7000, surely a sub-$5k CE display is a possibility this year. I doubt there are serious difficulties to overcome in scaling the LED matrix beyond 30".
 
SED is late; OLED is the future. Hell, I agree with MuFu, I'd rather have an LCD with LED backlighting.
 
nelg said:
Apparently if you supply enough current to anything organic it will glow.

Apparently if you supply enough power to the right organic mix, little crawly swishy things start coming out.
 
Guden Oden said:
If it forms static magnetic fields around the screen/between screen and user, and/or emits X-rays (high probability, since it relies on accelerating electrons), I'm not buying it and I don't think anyone else should either.

If it requires high refresh rates to avoid image flicker, I'm not buying it EITHER. Crappy tech flickers. Good tech (LCD) does not.

And what about burn-in? It still uses phosphors. And what about weight? It's going to need to be fairly thick methinks if it's containing a vacuum.


It's easy to get rid of all flicker on a CRT: just run it at 100Hz. But SED should not flicker, as you have discrete subpixels instead of three electron guns panning the screen, though I don't know if the subpixels are always on or if they flicker at 100Hz or more.

I don't care at all about radiation, and a CRT radiates most to the back anyway.

As for burn-in, I only saw it on really old text-only or CGA monochrome monitors which ran the same DOS app for a decade :), or on a 25-year-old jukebox with a TV screen. No problem with a typical 15-year-old TV or VGA monitor.
 
Blazkowicz_ said:
It's easy to get rid of all flicker on a CRT: just run it at 100Hz. But SED should not flicker, as you have discrete subpixels instead of three electron guns panning the screen, though I don't know if the subpixels are always on or if they flicker at 100Hz or more.

I don't care at all about radiation, and a CRT radiates most to the back anyway.

As for burn-in, I only saw it on really old text-only or CGA monochrome monitors which ran the same DOS app for a decade :), or on a 25-year-old jukebox with a TV screen. No problem with a typical 15-year-old TV or VGA monitor.


The thing with SED is that it scans line by line like CRT TVs - you can see this if you take a picture or a video of it: a plasma or LCD will show a full picture, while a CRT or SED will show only part of the picture, because the panel scans the whole screen more slowly than the camera's shutter, in brief.
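A quick sketch of that geometry (my own illustration, with made-up shutter speeds and a 60 Hz scan assumed):

Code:
# Rough illustration (my numbers, not from this post) of why a photo of a
# scanning display shows only a band of the picture: the exposure is
# shorter than one full top-to-bottom scan.
def visible_fraction(shutter_s, refresh_hz):
    """Fraction of the screen lit during one exposure (phosphor decay ignored)."""
    scan_period = 1.0 / refresh_hz
    return min(shutter_s / scan_period, 1.0)

for shutter in (1/500, 1/250, 1/60):
    print(f"1/{round(1/shutter)} s shutter at 60 Hz: "
          f"{visible_fraction(shutter, 60):.0%} of the frame")
# 1/500 s -> 12%, 1/250 s -> 24%, 1/60 s -> 100%. An LCD or plasma holds
# every pixel lit for the whole frame, so any shutter captures a full picture.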
100Hz will help with the flickering obviously, and in the end these panels will give such great image quality (if they deliver on the promises, obviously) that I don't think anyone will or should complain.
I've only had my HDTV for a couple of weeks and I'm already thinking about the future, when I'll sell it for an SED or for "something else". Shame it seems that these new panels will only be VERY big, and I can't have a huge TV at home because it would look out of place.
 
Blazkowicz_ said:
As for burn-in, I only saw it on really old text-only or CGA monochrome monitors which ran the same DOS app for a decade :), or on a 25-year-old jukebox with a TV screen. No problem with a typical 15-year-old TV or VGA monitor.

And I've seen too many 'press Ctrl-Alt-Del to log in' burns.
 
Blazkowicz_ said:
It's easy to get rid of all flicker on a CRT: just run it at 100Hz. But SED should not flicker, as you have discrete subpixels instead of three electron guns panning the screen, though I don't know if the subpixels are always on or if they flicker at 100Hz or more.

I don't care at all about radiation, and a CRT radiates most to the back anyway.

As for burn-in, I only saw it on really old text-only or CGA monochrome monitors which ran the same DOS app for a decade :), or on a 25-year-old jukebox with a TV screen. No problem with a typical 15-year-old TV or VGA monitor.
Any technology (CRT, SED, etc.) which shoots electrons at a phosphor-coated screen will flicker. The intensity and frequency of flickering is entirely dependent on the decay time of the phosphor and the rate at which electrons are shot at the screen.

Changing the decay time has certain advantages and disadvantages. A long decay time means that you don't need to refresh the screen as often to avoid flickering, but that images on the screen will ghost, as it takes longer for any given pixel to stop glowing. Short decay times mean that pixels stop glowing quickly, so you get fast response times, but you need to refresh the screen more often; otherwise it will be easy to notice a bright pixel constantly being refreshed, decaying, and refreshing again (in other words, flickering).

SED displays are the same as CRTs in this regard. You have tradeoffs between the refresh rate and the decay rate of the phosphor. Having a short phosphor decay and a high refresh rate is optimal, but it also means that you need more throughput to the screen, and a higher framerate to avoid tearing. LCD displays have "always on" pixels, so the refresh rate for LCDs can pretty much be set to any rate, but they are still limited in how quickly the LCD elements can change state, just as CRTs are limited by how fast the phosphor decay is. Currently CRT phosphor decay is pretty fast, which is why CRTs have recently had fewer problems with ghosting than LCDs. On the other hand, LCDs have a certain long-term advantage in that their response times aren't tied to the refresh rates like they are with CRTs. A very, very fast phosphor decay would mean you'd need very high (to a point) refresh rates on a CRT to avoid flicker. LCDs would not suffer this problem.
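A toy model of that tradeoff (my own sketch, with made-up decay constants, assuming simple exponential phosphor decay between refreshes):

Code:
import math

# Impulse-driven phosphor decaying as exp(-t/tau) between refreshes
# (my illustration of the tradeoff described above).
def tradeoff(refresh_hz, tau_ms):
    T = 1000.0 / refresh_hz                  # frame period, ms
    residue = math.exp(-T / tau_ms)          # brightness left at next refresh
    flicker = (1 - residue) / (1 + residue)  # Michelson modulation depth
    return flicker, residue

for hz, tau in ((60, 1), (100, 1), (60, 20), (100, 20)):
    flicker, ghost = tradeoff(hz, tau)
    print(f"{hz} Hz, tau = {tau} ms: flicker depth {flicker:.2f}, "
          f"carry-over into next frame {ghost:.1%}")
# Fast phosphor (1 ms): essentially zero ghosting but near-total flicker,
# so you need a high refresh rate. Slow phosphor (20 ms): flicker drops,
# but 43-61% of a pixel's light bleeds into the next frame (ghosting).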

Nite_Hawk
 
Nite_Hawk said:
Any technology (CRT, SED, etc.) which shoots electrons at a phosphor-coated screen will flicker. The intensity and frequency of flickering is entirely dependent on the decay time of the phosphor and the rate at which electrons are shot at the screen.
That is not true for SED if you use some kind of memory element that keeps the electron beam at a constant (low) level for the whole refresh cycle. Then you can use the fastest-decaying phosphors for an image free of ghosting and flickering.
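Extending the toy model above (my own sketch, not an actual SED drive scheme): if the emitter stays on for a fraction of each frame instead of firing a single impulse, the steady-state flicker depends only on how long the phosphor is left to decay:

Code:
import math

# Emitter on for a fraction `duty` of each frame; in steady state the
# flicker depth depends only on the dark interval per frame.
def modulation_depth(duty, refresh_hz=60, tau_ms=1.0):
    off_time = (1.0 - duty) * 1000.0 / refresh_hz  # dark interval, ms
    r = math.exp(-off_time / tau_ms)               # decay over that interval
    return (1 - r) / (1 + r)                       # Michelson contrast

for duty in (0.01, 0.5, 0.99, 1.0):
    print(f"duty {duty:.0%}: flicker depth {modulation_depth(duty):.3f}")
# duty 1% or 50% -> ~1.0 with a fast 1 ms phosphor (impulse-like drive).
# duty 100% -> 0.0: a hold-type drive kills flicker even with the
# fastest-decaying phosphor, which is the point made above.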
 
RussSchultz said:
OLED apparently is a giant power hog, even worse than plasma.
Where on earth did you hear that? Almost all indications I've seen about OLED are that it is significantly lower power than anything out there. You think they'd use OLED displays on cell phones if they were power hogs? OLED backlights for LCD screens were used mainly because they are lower power and run cooler than any CCFL backlight. It'll also prove to be the lightest-weight display because it is self-luminescent and doesn't require a backlight like LCD does.

The main problem with OLED is lifespan. No one has been able to make them consistent across all 3 colors as well as create something that lasts a reasonably good amount of time. That's why for now, they're relegated to small screens for cell phones and digital cameras.

SED also has lower power consumption than LCD, but apparently only above a certain size. When you start scaling sizes down, LCD's power drops off faster (backlights being the biggest factor, I'd expect).

The thing with SED is that it scans line by line like CRT TVs - you can see this if you take a picture or a video of it: a plasma or LCD will show a full picture, while a CRT or SED will show only part of the picture, because the panel scans the whole screen more slowly than the camera's shutter, in brief.
AFAICT, SED scans using a transistor matrix just the same as LCD does. Since it has a gun for every pixel, that only makes sense.
 
Xmas said:
That is not true for SED if you use some kind of memory element that keeps the electron beam at a constant (low) level for the whole refresh cycle. Then you can use the fastest-decaying phosphors for an image free of ghosting and flickering.
Well, it's technically still true, because all you are doing here is speeding up the rate at which electrons hit each pixel element. It just means that the flicker would be much, much shorter and much, much less intense. :)

Still, a number of reviews seem to indicate that the SED displays they've seen still have scanlines, which is a bit odd. Perhaps there are some limitations to shooting out a constant lower-power stream of electrons (heat, efficiency, etc.).

Edit: I should also mention that current-gen LCDs use fluorescent backlights, which flicker too, but it's generally fast enough that you don't notice it. It's not a limitation of LCD technology though, just the backlight method.

Nite_Hawk
 
Nite_Hawk said:
Well, it's technically still true, because all you are doing here is speeding up the rate at which electrons hit each pixel element. It just means that the flicker would be much, much shorter and much, much less intense. :)
But on that scale you could say light flickers ;)
 
RussSchultz said:
OLED apparently is a giant power hog, even worse than plasma.

Link?

My understanding is that early reports said OLED would be a power-saving technology, although in practice the savings over LCD have been negligible.
 
Blazkowicz_ said:
It's easy to get rid of all flicker on a CRT: just run it at 100Hz. But SED should not flicker, as you have discrete subpixels instead of three electron guns panning the screen, though I don't know if the subpixels are always on or if they flicker at 100Hz or more.

I don't care at all about radiation, and a CRT radiates most to the back anyway.

As for burn-in, I only saw it on really old text-only or CGA monochrome monitors which ran the same DOS app for a decade :), or on a 25-year-old jukebox with a TV screen. No problem with a typical 15-year-old TV or VGA monitor.
QFT
Guden Oden is like the media: trying to scare everyone with things blown way out of proportion.
I only need 75Hz to be flicker-free on my 19" CRT, btw.
I don't have any burn-in on my CRT either :rolleyes:
I really like the bit about how good technology doesn't flicker. First off, 75Hz and above should get rid of it for most people; second, LCD in its current form isn't good technology, it's just thin, so everyone goes googoo over it.
Good technology doesn't look good at only one or two resolutions :p
 