OK, here I go. Take a deep breath, Ed:

Let's remember that AT&T's standard (which is now referred to as 568B or WECO) was developed in the early 1970s, before anyone else was a huge player with regard to system manufacture. Sure, there were other manufacturers, but none had the dedicated marketing power that AT&T had with their Bell companies in nearly every major city. (Sorry for the side-track).

AT&T needed an 8P8C solution to support their MET sets (Modular Electronic Telephone) on Dimension and Horizon PBXs, so they developed it on their own. They later transitioned this standard to their Merlin key telephone systems in the early 80s. This wiring was never intended to support anything but their own proprietary telephone systems and it had absolutely nothing to do with computer networks. Voice was on pair one (pins 4/5), data was on pins 3/6 and power was on pins 1/2. This was done to prevent any harm by someone plugging a standard phone cord into a MET jack since no pins in the plug would engage with both power pins, which were effectively out of reach for a standard 6P4C plug. Pretty smart, huh?
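The geometry behind that trick is easy to sketch. The following is a quick illustration (my own toy function, not anything from a standard) of which jack pins a smaller, centered modular plug can actually touch; a 6P4C plug's four contacts land on 8P8C pins 3 through 6, so the power pair on pins 1/2 is never engaged.

```python
# Which 8P8C jack positions does a smaller modular plug actually touch?
# A 6-position plug inserted into an 8-position jack sits centered, so its
# six positions line up with jack positions 2-7. A 6P4C plug only has
# contacts in its middle four positions, which land on jack pins 3-6.

def engaged_pins(jack_positions: int, plug_positions: int, plug_contacts: int):
    """Return the jack pin numbers a centered plug's contacts touch."""
    offset = (jack_positions - plug_positions) // 2            # plug is centered in the jack
    first_contact = (plug_positions - plug_contacts) // 2 + 1  # contacts are centered in the plug
    return [offset + first_contact + i for i in range(plug_contacts)]

# A standard phone cord (6P4C) in a MET-style 8P8C jack:
print(engaged_pins(8, 6, 4))   # [3, 4, 5, 6] -- power pins 1/2 never touched
```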

As Sam said, the computer networks at this time were on everything from COAX to TwinAX to you name it.

THEN came a problem with the divestiture of the Bell System in 1984. The Universal Service Order Code (USOC) was developed so that a customer could order a jack from any local telco using a standard 'RJ' designation. From Miami to Seattle and all points in between, you would be guaranteed to receive the same jack wired the way it was ordered. RJ11 for one line, RJ14 for two, RJ25 for three and RJ61 for four.

USOC jacks were wired from the center two pins outward, meaning that line one was on the center two pins, line two was on the next pins outward from center, line three was on the next two outward pins, and so on. It was fairly simple.
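That center-outward pattern is regular enough to write down as a one-liner. Here's a small sketch (my own helper, just to make the pattern concrete) that computes the USOC pin pair for a given line number on an 8-position jack:

```python
# USOC wires pairs from the center two pins outward. On an 8-position jack:
# line 1 -> pins 4/5, line 2 -> pins 3/6, line 3 -> pins 2/7, line 4 -> pins 1/8.

def usoc_pins(line: int, positions: int = 8):
    """Return the (left, right) pin pair for a given line number."""
    left = positions // 2 - (line - 1)   # steps outward from center-left
    right = positions // 2 + line        # steps outward from center-right
    return (left, right)

for line in range(1, 5):
    print(f"line {line}: pins {usoc_pins(line)}")
# line 1: pins (4, 5) ... line 4: pins (1, 8)
```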

At some point, Ethernet wiring for computer networks transitioned over from COAX to 100-ohm unshielded twisted pairs, and in the early days, CAT3 cable was more than sufficient to support 10 Mbps transmission speeds. Only two pairs are needed for Ethernet over twisted pair (transmit and receive). How does the industry adapt to the obvious need for a modular jack that is convenient for the user, yet ensures that nothing will blow up if they plug into the incorrect jack?

This is when the problem arose. Sticking with existing USOC wiring patterns could potentially put CO line voltages of 48 volts or more on pins of a jack that are meant for data connections. They built in a safeguard under the USOC program where all of the tips on lines 2, 3 and 4 were to the left of center and the rings were to the right. This meant that anything plugged into a USOC jack wouldn't be harmed under the emerging Ethernet or the existing AT&T (later 568B) standard. Man, they sure had to put a lot of thought into this process, didn't they?

I'm guessing that at this point, they had to figure out the least-dangerous pins to use if a computer got plugged into an RJ61X jack. Someone selected pins 1, 2, 3 and 6, as AT&T had previously used them for their power and data connections, on pins that were typically out-of-bounds.

So then came the issue of merging USOC, Ethernet and AT&T wiring standards on the same 8P8C jack. Seriously, some equipment damage can happen if someone plugs into the wrong jack. At first, the solution was keyed plugs and jacks. Data cables and jacks had a slot/tab on the left side that prevented the cables from being inserted into a voice jack that didn't have this keyway. That worked for a little while, but even that wasn't the solution.

Well, sort of. Earlier in the development of the USOC program, they already addressed this by mandating keyed jacks/plugs for telco data circuits that were often wired as RJ41S or RJ45S. The telco would put in a jack to terminate a data circuit, intending for it to be used for a modem with a keyed plug. This kept people from getting into trouble until twisted pair Ethernet came along in the late 80s. An unkeyed voice or Ethernet cable could still be plugged into a keyed USOC RJ41 or RJ45 telco jack.

So, back to the original subject: USOC was the national standard that telcos used, with pair one on pins 4/5, pair two on pins 3/6 and so on. Since there are hundreds of telcos in North America, many independent of AT&T, they had no knowledge of AT&T's own wiring standard on an 8P8C jack, so they maintained their own wiring, which was eventually standardized as today's 568A (remember the tips-on-the-left thing I mentioned with the RJ61X configuration earlier).

At the end of the day, AT&T and its Bell System companies maintained their standard and they weren't being told to change by anyone. At the same time, the independent (non-Bell) telcos were sticking with USOC standards. Something had to be done to bridge this gap. There was no changing the USOC standard; that was mandated by the FCC. AT&T stuck with what is now 568B and the independents went with their own 568A.

Truth is, 568A and B are exactly the same if they're wired using the same pattern at each end of the cable. It's only when a cable is terminated with A on one end and B on the other that there will be a problem. As long as a site is wired using a uniform standard, nobody will notice a thing.
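You can see why with a quick sketch of the two pin-to-pair maps (pair colors per the published T568A/T568B assignments, with T/R marking tip and ring). Wired the same at both ends, every conductor lands on the same pin number at each end, so the standards are electrically identical. Mix A and B ends and the orange and green pairs swap, pins 1/2 crossing to 3/6, which happens to be the classic 10/100 Ethernet crossover:

```python
# Pin -> (pair color, Tip/Ring) for the two termination standards.
T568A = {1: ("green", "T"), 2: ("green", "R"), 3: ("orange", "T"),
         4: ("blue", "R"), 5: ("blue", "T"), 6: ("orange", "R"),
         7: ("brown", "T"), 8: ("brown", "R")}
T568B = {1: ("orange", "T"), 2: ("orange", "R"), 3: ("green", "T"),
         4: ("blue", "R"), 5: ("blue", "T"), 6: ("green", "R"),
         7: ("brown", "T"), 8: ("brown", "R")}

# Same standard at both ends: each conductor arrives on the same-numbered
# pin, so an A-A cable and a B-B cable behave identically.

# A on one end, B on the other: follow each conductor from its A-end pin
# to the B-end pin carrying that same conductor.
wire_to_B_pin = {wire: pin for pin, wire in T568B.items()}
mixed = {a_pin: wire_to_B_pin[wire] for a_pin, wire in T568A.items()}
print(mixed)  # {1: 3, 2: 6, 3: 1, 4: 4, 5: 5, 6: 2, 7: 7, 8: 8}
```

The blue and brown pairs stay put; only 1/2 and 3/6 trade places, which is why an accidental A/B cable still "works" as a crossover on older 10/100 gear.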

The answer to the original question is that the 568A and 568B standards were created as a result of two very powerful bodies refusing to change. Their refusal to change didn't net anything at all. Both systems work exactly the same.

Oh, and as for that RJ41/RJ45 reference made earlier: Those no longer exist. Telcos now deliver circuits on RJ48 jacks. CGs have since applied the RJ45 nomenclature to any network jack. I guess that's OK. My doctor told me to quit stressing about it.


Ed Vaughn, MBSWWYPBX