To be honest, I've never heard of Teco, but that's probably because I live in the US. That being said, the problem you're having could be the result of poor manufacturing on a budget TV set (as much as you probably don't want to hear it). The best thing to do is just run it using whatever connection looks best to you (or return the TV). Your TV's native resolution is 1366x768, according to Google. Note that no digital TV has an "interlaced" native resolution. CRTs were natively interlaced, but digital televisions such as LCDs are always progressive (i.e., they draw a full pixel raster every time they refresh). Any interlaced signal sent into the TV (e.g., 1080i) will be de-interlaced by the TV before it is displayed, and any signal that doesn't match the TV's native resolution will be scaled to fit.
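If it helps to picture what the TV is doing, here's a rough sketch of that pipeline in Python. The function names (`weave_fields`, `scale_nearest`) are made up for illustration, and real deinterlacers are much smarter (motion-adaptive, cadence detection), but the basic idea is the same: two half-height fields get reassembled into a full frame, then the frame gets rescaled to the panel's native raster.

```python
# Sketch of a fixed-pixel TV's handling of a 1080i input:
# weave two 540-line fields into one 1080-line frame, then
# scale the result to a hypothetical 1366x768 native panel.
import numpy as np

NATIVE_W, NATIVE_H = 1366, 768  # typical "720p" LCD panel

def weave_fields(top_field, bottom_field):
    """Interleave two half-height fields into one progressive frame."""
    h, w = top_field.shape
    frame = np.empty((h * 2, w), dtype=top_field.dtype)
    frame[0::2] = top_field      # odd scan lines
    frame[1::2] = bottom_field   # even scan lines
    return frame

def scale_nearest(frame, out_w=NATIVE_W, out_h=NATIVE_H):
    """Nearest-neighbour rescale to the native raster (real scalers
    use better filters, but the mapping is the same idea)."""
    in_h, in_w = frame.shape
    ys = np.arange(out_h) * in_h // out_h
    xs = np.arange(out_w) * in_w // out_w
    return frame[ys][:, xs]

# 1080i input: two 540x1920 fields -> one 1080x1920 frame -> 768x1366
top = np.zeros((540, 1920), dtype=np.uint8)
bottom = np.zeros((540, 1920), dtype=np.uint8)
panel = scale_nearest(weave_fields(top, bottom))
print(panel.shape)  # (768, 1366)
```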
In any case, if you like the picture with 1080i, then use 1080i! Your set may just be better at downconverting a 1920x1080 raster to its native resolution over HDMI (as opposed to upconverting 1280x720, i.e., 720p). Why this is happening is a question for the manufacturer (again, one I've never heard of).
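The arithmetic makes the difference between the two jobs concrete. This is just the scale ratios for the resolutions mentioned above; nothing here is specific to your set:

```python
# Scale factors the TV's scaler has to apply for a 1366x768 panel.
# Downconverting 1080 material discards detail; upconverting 720
# material has to invent pixels. Some scalers do one job better
# than the other, which could explain what you're seeing.
native_w, native_h = 1366, 768

for name, (w, h) in {"1080i/1080p": (1920, 1080), "720p": (1280, 720)}.items():
    print(f"{name}: x{native_w / w:.3f} horizontal, x{native_h / h:.3f} vertical")

# Prints roughly:
#   1080i/1080p: x0.711 horizontal, x0.711 vertical  (downscale)
#   720p: x1.067 horizontal, x1.067 vertical  (upscale)
```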
Also, for the record, there are almost no true 720p TVs in existence. True 720p is 1280x720 by the standard, but only a few TVs were ever produced with that particular pixel structure. The industry went the way of 1366x768 panels instead, probably because they were cheaper to produce. Almost every LCD you see advertised as 720p will be that resolution, and almost every plasma will be something non-standard with rectangular pixels. The only thing that really matters is what your eyes tell you.
For 1080p TVs it's a different story. Almost every 1080p LCD has a true 1920x1080 panel; I can't think of any with non-standard panels off the top of my head. Likewise, most 1080p plasmas use a true 1080p pixel structure with square pixels (instead of the rectangular ones used in lower-res sets).
Thanks for your response, I think you might be right. I'll just have to leave it in 1080i. The reason I didn't want to run it this way was because I was under the impression that it limits everything to a max of 30 fps. Will this hold true for me even if the TV is converting the signal to its native res? I tested this with COD4, which I know runs at a solid 60 fps, but I couldn't notice any difference in frame rate between component 720p and HDMI 1080i. The picture was much better over HDMI, though. It's a really strange issue; you'd have thought it would be easier for the TV to run 720p than 1080i, but it must just be the way the TV is.
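For what it's worth, here's the back-of-the-envelope arithmetic behind that "30 fps" impression. It's only a sketch of the field/frame numbers, not any particular TV's processing, but it fits what I saw with COD4:

```python
# 1080i at 60 Hz carries 60 fields per second (each field is half
# the scan lines of a full frame).
field_rate_hz = 60

# If the TV weaves pairs of fields into full frames, you only get
# 30 complete pictures per second...
woven_fps = field_rate_hz // 2

# ...but a 60 fps source (like COD4) puts a fresh image into every
# field, so motion still updates 60 times per second either way.
motion_updates_per_s = field_rate_hz

print(f"full frames after weaving: {woven_fps} fps")
print(f"motion updates from a 60 fps source: {motion_updates_per_s}/s")
```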