What Are You Doing Right Now
-
@scottalanmiller said in What Are You Doing Right Now:
@NerdyDad said in What Are You Doing Right Now:
It covered modern Windows OSs from XP to now...
So by "now", you mean Windows 10? Did it cover iOS or Android? Linux desktops or Mac? And why would XP be on there, even Vista is a ridiculous thing to have been testing on. And I'd not be happy with it having anything older than 8.1 on a current exam. The best modern bench tech easily will never see XP in his entire career today.
It mentioned Mac and Linux briefly but did not go into how to set an IP address or check network configs in either one, as they say there are too many different ways to go about it in each version, and they didn't have the time and didn't want to confuse people about how to set it.
No mention of Windows 10.
As far as virtualization goes, they didn't explain how it worked, just how to use it within a single PC.
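Checking the local network config really does vary by platform tooling (`ip`, `ifconfig`, `nmcli`, and so on), but there is a cross-platform way to at least see which address a machine uses. This is a minimal, hypothetical Python sketch (not from any exam material) that reports the outbound IPv4 address the same way on Linux, Mac, or Windows:

```python
import socket

def local_ip() -> str:
    """Return the IPv4 address the OS would use for outbound traffic.

    connect() on a UDP socket sends no packets; it just makes the OS
    pick an outbound interface and source address for us.
    """
    try:
        with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
            s.connect(("8.8.8.8", 80))  # any routable address works
            return s.getsockname()[0]
    except OSError:
        # No route at all (e.g. machine is offline): fall back to loopback.
        return "127.0.0.1"

print(local_ip())
```

Since nothing is actually transmitted, this works even without internet connectivity as long as a route exists, which is why it's a handy trick for a bench tech regardless of distro.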
-
@scottalanmiller said in What Are You Doing Right Now:
@NerdyDad said in What Are You Doing Right Now:
@scottalanmiller said in What Are You Doing Right Now:
@Tim_G said in What Are You Doing Right Now:
But again, as NerdyDad pointed out... how many places now a days have someone just standing at a bench fixing PCs all day?
It's either scrap it if it doesn't work, or reimage it... LOL
SAM's Revised One Question A+
Question: A customer comes to you with a laptop, they say that things are not working and they need it fixed.
Do you:
A ) Run a diagnostic script on it to determine what is wrong?
B ) Interrogate the user to figure out what "it is not working" means?
C ) Sacrifice a goat to the bench gods for guidance?
D ) Rapidly re-image the machine to ensure that it is clean, working and malware free?
The tech side says A) to get to the problem or possibly recover data before going to D). If D fails, I would go to C.
A is not a tech decision, that's an unscrupulous business decision to fleece customers of money for expensive diagnostics. That's the one lesson I'd want the A+ to teach, don't waste time fixing things that are not designed to be fixed. Once in a while it will make sense, but so rarely that it's widely considered unethical to promote doing it.
The only thing that I would do before reimaging is to extract needed data and scan extracted data for malicious software before bringing the data into the reimaged computer. Otherwise, reimage and don't even worry about finding the root cause. More than likely you won't find it.
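That extract-scan-reimage workflow can be sketched in a few lines. Everything here is hypothetical (the paths, the extension list is illustrative and far from exhaustive); a real bench would pair this staging step with an actual malware scanner over the staging directory before restoring anything to the reimaged machine:

```python
import shutil
from pathlib import Path

# Illustrative, NOT exhaustive: file types held back for separate scanning.
RISKY = {".exe", ".dll", ".scr", ".js", ".vbs", ".bat"}

def extract_user_data(profile: Path, staging: Path) -> list:
    """Copy a user's files into a staging area before reimaging,
    skipping executable types so they never ride back onto the
    clean image unscanned. Returns the list of copied files."""
    copied = []
    for src in profile.rglob("*"):
        if src.is_file() and src.suffix.lower() not in RISKY:
            dest = staging / src.relative_to(profile)
            dest.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(src, dest)  # preserves timestamps/metadata
            copied.append(dest)
    return copied
```

The point of the skip list is exactly what the post describes: documents come across, anything executable gets quarantined and scanned separately, and nobody wastes an afternoon hunting for a root cause that will never be found.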
-
@scottalanmiller said in What Are You Doing Right Now:
@NerdyDad said in What Are You Doing Right Now:
It covered modern Windows OSs from XP to now...
The best modern bench tech easily will never see XP in his entire career today.
Well, ideally... you would hope so anyways.
-
@Tim_G said in What Are You Doing Right Now:
@scottalanmiller said in What Are You Doing Right Now:
@NerdyDad said in What Are You Doing Right Now:
It covered modern Windows OSs from XP to now...
The best modern bench tech easily will never see XP in his entire career today.
Well, ideally... you would hope so anyways.
"Ideally" is the key word here. In Scott's perfect vision of "modern" bench techery, one would not likely see XP, but put that tech in anytown, USA, and they will be up to their ears in XP, Vista, and other horrifyingly ancient technologies. Fact.
-
@NerdyDad said in What Are You Doing Right Now:
It mentioned Mac and Linux...
No mention of Windows 10.
So not exactly Windows XP to now, but Windows XP to "relatively current?" A cert like the A+ should be listing operating systems and training on them as soon as they are in beta, looking forward, not backward. Windows 10 had its production release in the middle of 2015 and so should have been the focus of the A+ in a 2014 revision for sure, and likely in a 2013 one. XP should have been dropped around 2010, realistically.
So even if you feel it has improved a lot, and it sounds like it did since XP would have been the newest thing mentioned when I took it rather than the oldest, it's still not sounding very practical.
-
@Tim_G said in What Are You Doing Right Now:
@scottalanmiller said in What Are You Doing Right Now:
@NerdyDad said in What Are You Doing Right Now:
It covered modern Windows OSs from XP to now...
The best modern bench tech easily will never see XP in his entire career today.
Well, ideally... you would hope so anyways.
Not saying ideally, just saying "commonly."
-
If it's not supported, then it shouldn't be covered. Vista is about to be unsupported; therefore, it should fall out of the curriculum here soon.
-
@RojoLoco said in What Are You Doing Right Now:
@Tim_G said in What Are You Doing Right Now:
@scottalanmiller said in What Are You Doing Right Now:
@NerdyDad said in What Are You Doing Right Now:
It covered modern Windows OSs from XP to now...
The best modern bench tech easily will never see XP in his entire career today.
Well, ideally... you would hope so anyways.
"Ideally" is the key word here. In Scott's perfect vision of "modern" bench techery, one would not likely see XP, but put that tech in anytown, USA, and they will be up to their ears in XP, Vista, and other horrifyingly ancient technologies. Fact.
I'm not suggesting anything like ideally. I'm saying that in the real world, you will often get work after your cert process and work an entire career without ever seeing that old stuff. Sure, lots of other people will see old things, I'm not suggesting otherwise. And I'm not injecting any feeling of what ideal is or isn't. Only stating the very real fact that a large percentage of people getting any cert will never see 14 year old technology, ever, in their career. If their very first job doesn't expose them to it, they will easily be at 17-20 years past that tech before going to their second job. It is REALLY trivial to never see old OSes, for example, even for people working in bench.
-
@NerdyDad said in What Are You Doing Right Now:
If it's not supported, then it shouldn't be covered. Vista is about to be unsupported; therefore, it should fall out of the curriculum here soon.
I'm not sure that Microsoft's own support decisions should be the basis for this type of cert. The goals might end up lining up, but might not.
I get that there is a desire to expose people to ancient stuff. But only a certain percentage of people will ever be affected by that in the real world of work. Especially if they don't get a bench job immediately following the cert process.
-
Instead of exposing them to mainstream OSes, why don't we teach them how to find the answers to the questions that they are looking for? Oh wait, our public education system is supposed to be teaching them that.
-
Imagine if you took the A+ today and it covered XP. Then imagine it takes six months to find work; that's the start of 2018, basically one and a half decades after XP released. Now imagine your first job isn't Best Buy (hopefully it is not), but instead working the bench in a mid-sized company. Let's say a grocery store chain (I've seen this exact job, so I'm choosing it specifically.) Even if that chain runs old stuff, there is a really good chance that Windows 7 is the oldest thing you will ever need to work on during that time, and even that might be rare. The chance that XP is left in production AND needs support from the bench team is pretty much zero, even today. It can happen, but "can happen" isn't a good thing to base your certification process around. Also, being versed in Windows 7 makes you decently useful on XP, so it's not like you can't work on it anyway, especially since you can Google most answers. There is essentially zero value in teaching old interfaces.
Add to that that even if 10% of all repairs or work coming in is XP, there is a very high chance that there is someone working in that shop that has had years of XP experience that you will never compete with and likely they will keep working, doing the XP work until long after the XP machines are no more.
There are many factors to this, but the bottom line is that teaching ancient tech, even if we expect 50% of certified "grads" to see it at some point in their career, makes little to no sense - especially when you consider that all education has to come at the expense of something else. So in this case, they learned XP instead of Windows 10!
-
@scottalanmiller said in What Are You Doing Right Now:
@NerdyDad said in What Are You Doing Right Now:
If it's not supported, then it shouldn't be covered. Vista is about to be unsupported; therefore, it should fall out of the curriculum here soon.
I'm not sure that Microsoft's own support decisions should be the basis for this type of cert. The goals might end up lining up, but might not.
I get that there is a desire to expose people to ancient stuff. But only a certain percentage of people will ever be affected by that in the real world of work. Especially if they don't get a bench job immediately following the cert process.
I don't see a point in anything prior to Win7. You don't need to learn XP or Vista to learn about Win7, 8.x, or 10. If anything, you should learn Win10 first, so you can understand why Win10 should be used instead of anything prior. You should also have exposure (through the A+) to OSX, iOS, Android, ChromeOS, Win10 S, CentOS, and Ubuntu equally... just as you would with the Windows versions it covers.
I also don't see a point in learning about a Pentium II processor. You don't have to know anything about those to learn about an i7... same with all PC components. Learn about the new...
-
It's like when I took my A+... they taught Windows 3.11. Because I went into IT, not into bench work (which CompTIA promoted the cert as being for - I did what I was supposed to do), and because I worked for actual businesses that took IT seriously, I never saw Windows 3.1 or 3.11 or DOS or Windows 95 or even Windows 98 or ME. None of those ever existed to me in the post-A+ workplace (or even at home.) The business world was running Windows NT 4, Linux, OS/2 and Solaris everywhere that I went for years to come. Big firms, small ones, it didn't matter. Two-person shops to Fortune 100, I never saw anything from the A+.
It's not that those things never existed, they did. But the focus was not on IT but on consumer home use, and on consumer home use from a different era altogether. And even at home, I didn't see those things. Not at my home or most of my friends'. A few had Windows 98 and one had ME, but I never worked on them. Those were just gaming machines.
Between the extreme age of the material, and the focus on non-business use cases, it meant that without even trying, no one that I knew taking the A+ ever saw the stuff from it in the real world. And had we had this conversation in 1998, people would have pointed out that Windows 3.11 was only five years old (not 15 like XP) and that computers coming in for repair at the corner shop would see 3.1, 3.11 and 95 all the time, and they would have been right. But I never saw one of those shops hire anyone that took the A+. Those shops were already stocked with old timers that had worked on those things for years. Those of us getting certified were heading into IT careers and working with emerging technology that we had just as much exposure to as the old timers, as it was new.
-
@Tim_G said in What Are You Doing Right Now:
I also don't see a point in learning about a Pentium II processor. You don't have to know anything about those to learn about an i7... same with all PC components. Learn about the new...
OMG, I forgot the insane amount of stuff that I had to memorize on 386, 486 and original Pentium processors! Not a single one have I seen yet in use since that exam!
-
A full HALF of that exam was memorizing things like serial port settings. Things I've never once used. Even at a conceptual level, nearly everything from the A+ was useless. Nearly everything from the one that I took was totally focused on the DOS specific implementation of settings, nothing that was generally applicable to the field. So none of it was valuable for me to apply elsewhere. All just wasted.
-
No question that the A+ has its problems (maybe nothing else.) So more constructive might be... what should the A+ have on it?
-
@StrongBad said in What Are You Doing Right Now:
No question that the A+ has its problems (maybe nothing else.) So more constructive might be... what should the A+ have on it?
I think they should retire the A+ and come up with something new... like: CompTIA Tech+ or something.
-
@Tim_G said in What Are You Doing Right Now:
@StrongBad said in What Are You Doing Right Now:
No question that the A+ has its problems (maybe nothing else.) So more constructive might be... what should the A+ have on it?
I think they should retire the A+ and come up with something new... like: CompTIA Tech+ or something.
I would agree with that. Make an annual bench exam called the CompTIA Bench+. It would, of course, be called the B+ by everyone from day one because, duh.
But let's be honest, even the NAME A+ is ridiculous and invites comparison to an auto shop mechanic certification or something. It's a totally useless name that should warn people of what the exam must be.
What would a Tech+ exam be like? Is an IT entry-level cert really a useful thing? Why not just make the Net+ be seen as the proper exam for that?
-
@StrongBad said in What Are You Doing Right Now:
No question that the A+ has its problems (maybe nothing else.) So more constructive might be... what should the A+ have on it?
Maybe there should be the "Helpdesk+".
-
Doing some work with MailChimp. First time working with them.