Sunday, 28 February 2016

The Ten-Year Network Development Plan

The 3rd EU Energy Package provides for a biennial, non-binding Ten-Year Network Development Plan (TYNDP) and mandates ENTSO-E to lead its development and publication. The TYNDP is designed to increase information and transparency regarding the investment in electricity transmission systems required on a pan-European basis and to support decision-making processes at regional and European level. The first pilot TYNDP was published in June 2010, the TYNDP 2012 followed in July 2012, and the next edition was scheduled for 2014.

Drawing up the TYNDP requires an analysis of the factors that drive the expansion of the electricity networks:
  • The legal requirements for the safe operation of electricity networks
  • Current developments in the structure of the generation system
  • The development of renewable energy
  • Demand
  • Short-term and medium-term energy policy goals and strategies for reorganising the energy system.
ENTSO-E also draws on the results of its working group on System Adequacy and Market Modelling (SAMM) when projecting the structure of the power supply system. Every two years the working group publishes the report 'Scenario Outlook & Adequacy Forecasts', together with scenarios containing key data on the European power supply system.
The resulting demand for network development is then set against current and planned expansion projects, revealing potential investment gaps. Based on this analysis, the community-wide network development plan proposes investments in the network infrastructure of the 34 European countries, the so-called 'Projects of overriding European importance'. ENTSO-E's TYNDP working group coordinates the preparation and consultation of the plan, and existing national plans for expanding the transmission network are incorporated into it.
The preparation of the draft TYNDP is accompanied by a continual round of discussions (e.g. on the underlying scenarios) with a broad stakeholder group. Once the draft is complete, a public consultation allows all participants to comment on it. The European Agency for the Cooperation of Energy Regulators (ACER) and the national regulatory authorities then examine the draft to ensure coherence with the national network development plans.

Friday, 26 February 2016

Skip the Crowds: Online Shopping Tips for Black Friday






You probably already know that many Black Friday bargains are available online too. Rather than brave the elements or risk being trampled at 5 a.m., it's much more pleasant to shop from the comfort of home. Save the Kevlar vest for another day.
We asked a few expert bargain-hunters how online shoppers can score the best deals on Black Friday. Here's their advice:
Dan de Grandpre, editor in chief of dealnews:
  • Doorbusters, those incredibly cheap deals designed to get people in the door, may be available online too. Retailers often say they won't sell doorbusters online, but that's not always true. Last year, for instance, Wal-Mart offered all of its doorbusters online--but not until later in the day on Friday.
  • Don't give up the savvy shopping techniques you'd use all year just because it's Black Friday.
  • Many Black Friday deals aren't in fact great deals. You'll find better prices online. Use price-comparison sites like PriceGrabber and Shopping.com to be sure you're getting the best deal.
Michael Brim, president of Black Friday deal site BFAds:
  • Some retailers jump the gun and start their online sales as early as midday Thanksgiving. Yes, while you're eating. (You could always excuse yourself from the table to nab that $78 Blu-ray player.)
  • Then again, many sales start between midnight and 3 a.m. (Eastern) on Friday. If you have trouble staying up, try a quick nap after dinner.
  • Be aggressive. You can't wake up at noon Friday and expect everything to be in stock.
  • How do you know when an online sale goes live? Well, you could reload a retailer's page for hours. Another option is to monitor Black Friday sites like BFAds. They'll let you know when sales are live.
John Dunkin, founder of iBlackFriday.com:
  • Plan ahead. Go to a retailer's site early, pick out everything you want, and add it to your shopping cart.
  • Next, log into your account on Thanksgiving Day to see if your products are available at Black Friday prices.
  • You can get pretty much everything you want online. If you don't need a doorbuster item, don't bother going to the brick-and-mortar store.

courtesy of PC World

Check out our Facebook page 'Memes Channel' and catch a few laughs.


Wednesday, 24 February 2016

Internet bot







An Internet bot, also known as a web robot, WWW robot or simply bot, is a software application that runs automated tasks (scripts) over the Internet. Typically, bots perform tasks that are both simple and structurally repetitive at a much higher rate than would be possible for a human alone. The largest use of bots is in web spidering, in which an automated script fetches, analyzes and files information from web servers at many times the speed of a human.
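As a rough sketch of what such a spider does in practice, the Python snippet below fetches pages, records their titles and follows any links it finds. The start URL, the use of the third-party requests library and the regex-based parsing are illustrative choices for this example, not something prescribed by the article.

    # Minimal sketch of a web-spidering bot: fetch pages, extract something, follow links.
    # Assumes the third-party 'requests' library; example.com is a placeholder URL.
    import re
    import requests

    def spider(start_url, max_pages=10):
        seen, queue = set(), [start_url]
        while queue and len(seen) < max_pages:
            url = queue.pop(0)
            if url in seen:
                continue
            seen.add(url)
            try:
                page = requests.get(url, timeout=5).text
            except requests.RequestException:
                continue
            # "Analyze and file" the page: here we just record its title.
            title = re.search(r"<title>(.*?)</title>", page, re.S)
            print(url, "->", title.group(1).strip() if title else "(no title)")
            # Queue any absolute links found on the page.
            queue.extend(re.findall(r'href="(https?://[^"]+)"', page))

    spider("https://example.com")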
Given the exceptional speed with which bots can perform their relatively simple routines, bots may also be implemented where a response speed faster than that of humans is required. Common examples include gaming bots, where a player gains a significant advantage by letting a bot carry out some repetitive routine rather than doing it manually, and auction-site bots, where last-minute bid-placing speed may determine who places the winning bid; using a bot to place counterbids affords a significant advantage over bids placed manually.
Bots are routinely used on the internet where the emulation of human activity is required, for example chat bots. A simple question and answer exchange online may appear to be with another person, when in fact it is simply with a bot.
While bots are often used to simply automate a repetitive online interaction, their ability to mimic actual human conversation and avoid detection has resulted in the use of bots as tools of covert manipulation. On the internet today bots are used to artificially alter, disrupt or even silence legitimate online conversations. Bots are sometimes implemented, for example, to overwhelm the discussion of some topic which the bot's creator wishes to silence. The bot may achieve this by drowning out a legitimate conversation with repetitive bot-placed posts which may in some cases appear to be reasonable and relevant, in others simply unrelated or nonsense chatter, or alternatively by overwhelming the target website's server with constant, repetitive, pointless bot-placed posts. These bots play an important role in modifying, confusing and silencing conversations about, and the dissemination of, real information regarding sensitive events around the world.
The success of bots may be largely due to the very real difficulty in identifying the difference between an online interaction with a bot versus a live human. Given that bots are relatively simple to create and implement, they are a very powerful tool with the potential to influence every segment of the world-wide web.
Efforts by servers hosting websites to counteract bots vary. Servers may choose to outline rules on the behavior of internet bots by implementing a robots.txt file: this file is simply text stating the rules governing a bot's behavior on that server. Any bot interacting with (or 'spidering') any server that does not follow these rules should, in theory, be denied access to, or removed from, the affected website. If the only rule implementation by a server is a posted text file with no associated program/software/app, then adhering to those rules is entirely voluntary - in reality there is no way to enforce those rules, or even to ensure that a bot's creator or implementer acknowledges, or even reads, the robots.txt file contents.
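To make the robots.txt mechanism concrete, here is a small, hedged illustration: a hypothetical rules file and a check using Python's standard urllib.robotparser module. The domain, user-agent name and rules are invented for the example.

    # A hypothetical robots.txt published at https://example.com/robots.txt:
    #
    #   User-agent: *
    #   Disallow: /private/
    #
    # A cooperative bot can consult it with the standard library before crawling:
    from urllib import robotparser

    rp = robotparser.RobotFileParser()
    rp.set_url("https://example.com/robots.txt")
    rp.read()  # fetch and parse the rules file

    url = "https://example.com/private/data.html"
    if rp.can_fetch("ExampleBot", url):
        print("allowed to fetch", url)
    else:
        print("disallowed by robots.txt, so a polite bot skips", url)

    # Nothing in the protocol itself forces compliance; a badly behaved bot can
    # simply ignore the file, which is exactly the voluntariness described above.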

continue reading here : https://en.wikipedia.org/wiki/Internet_bot



Monday, 22 February 2016

Robots in hospitals

Robots in hospitals can be quite handy. Take, for example, the one you can see in this picture: he is holding a bag and looks quite cute, but his main skill is the ability to navigate around the hospital, so he can help patients find their way to wherever they need to go, as well as carry things around.

Of course, there are some more serious robots in hospitals too. I am talking about surgical robots, but this article is focused on machines that can ease the day-to-day work in a hospital. So, if that is what interests you, you should check out my separate article on surgical robots.
OK, back to diligent hospital workers. I like to divide everything into types, categories, sections and so on. This article is not an exception. I will try to shed some light on robots in hospitals by discussing, more or less separately, the different kinds of robots that operate (or will operate) in hospitals.


courtesy of allonrobots




Saturday, 20 February 2016

History of Network Security





Since the inception of networked computers, security has been a concern. Before the '90s, networks were relatively uncommon and the general public was not made up of heavy internet users. In those days security was not as critical; however, as more and more sensitive information was placed on networks, it would grow in importance.
During the 70s and 80s, researchers with access to the "internet" enjoyed playing practical jokes on each other through the network. These jokes were harmless, but nonetheless, exposed flaws in the security of the ARPANET (forerunner to today's internet). The network, at this time, was small and many users knew each other in their professional careers - which limited the risk and threat.
There were several high profile violations of security during this time - including the popularization of the term "hacker" by the movie War Games - but few serious steps taken to limit the damage, other than Congressional Hearings and proposed legislation.

Growing Threats

In the late '80s, use of the network began to grow quickly. With universities, government and military installations connecting, the need for security was growing. The first automated worm appeared on the ARPANET in 1988. The "Morris Worm", developed by a student at Cornell, exploited the absence of intrusion prevention systems: it could connect to another computer, use vulnerabilities to copy itself, and send itself on to a new location. The self-replicating Morris Worm did much to expose the vulnerabilities of networked computers, using so many resources that infected computers were rendered inoperable and spreading quickly throughout the network. At this point, influential leaders in the network community decided to begin developing countermeasures against network threats.

Formation of CERT

The government agencies in charge of developing ARPANET worked with other users of the network to develop the Computer Emergency Response Team (CERT) - the first network security organization. Formed in 1988, CERT actively spread awareness of security protocols and researched ways to mitigate and altogether prevent breaches. As the internet population exploded and the commercial use restriction of ARPANET (now known simply as the Internet) was removed, networks became a highly appealing target for "hackers" around the world.
In the mid-90s, the number of potential threats had multiplied exponentially, necessitating the creation of anti-virus and firewall programs to protect computers. These mass-produced protections were necessary, as the number of threats and users had grown too large to handle with custom measures and teams.

Firewalls

Generally, the first firewall is credited to a researcher at a NASA center in California. After the center was attacked by a virus in 1988, researchers there developed a virtual version of the "firewall" used in physical structures to prevent fires from spreading to other parts of a building or complex. By using routers to separate networks into smaller networks, these firewalls prevented an attack from infecting all the computers on a network at the same time. Over time, firewalls have developed into highly sophisticated tools that monitor and prevent potentially malicious requests or traffic from reaching systems and networks. Firewalls now check requests and impose restrictions on data at multiple levels, including the web application level.

Anti-Virus Protection

The other late-80s development in internet security was the anti-virus program. Numerous companies and programmers developed software that would "clean" infected computers after they contracted a virus. While many of the viruses in the late-80s period were not necessarily malicious, as time went on, anti-virus protection became a mandatory addition to any network that wanted to remain secure. The viruses of the 90s and beyond were more malicious in nature, designed to infiltrate and steal, destroy or otherwise disable networks.

How Radware Solutions Protect Networks

Radware solutions offer protection from these sorts of network intrusions, featuring a wide variety of attack mitigation capabilities including Network Behavioral Analysis (NBA), Denial of Service (DoS) protection, a reputation engine, and web application firewalls (WAF).


thanks to Radware

Thursday, 18 February 2016

History of Topology








Topological ideas are present in almost all areas of today's mathematics. The subject of topology itself consists of several different branches, such as point set topology, algebraic topology and differential topology, which have relatively little in common. We shall trace the rise of topological concepts in a number of different situations.


Perhaps the first work which deserves to be considered as the beginnings of topology is due to Euler. In 1736 Euler published a paper on the solution of the Königsberg bridge problem entitled Solutio problematis ad geometriam situs pertinentis, which translates into English as The solution of a problem relating to the geometry of position. The title itself indicates that Euler was aware that he was dealing with a different type of geometry where distance was not relevant.
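The core of Euler's argument is a simple counting step: a walk crossing every bridge exactly once can only exist if at most two land masses touch an odd number of bridges, and in Königsberg all four do. The short sketch below just tallies those degrees for the classical seven-bridge layout.

    # The seven bridges of Koenigsberg as edges between the four land masses:
    # A = Kneiphof island, B and C = the two river banks, D = the eastern land mass.
    bridges = [("A", "B"), ("A", "B"), ("A", "C"), ("A", "C"),
               ("A", "D"), ("B", "D"), ("C", "D")]

    degree = {}
    for u, v in bridges:
        degree[u] = degree.get(u, 0) + 1
        degree[v] = degree.get(v, 0) + 1

    print(degree)  # {'A': 5, 'B': 3, 'C': 3, 'D': 3}
    odd = [node for node, d in degree.items() if d % 2 == 1]
    # A walk crossing every bridge exactly once needs 0 or 2 odd-degree vertices;
    # here all four are odd, so no such walk exists.
    print("walk possible:", len(odd) in (0, 2))  # False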

check this link out for more : groups

Tuesday, 16 February 2016

11 Things We Learned About 'The Amazing Spider-Man 2'



Spidey is back in theaters today in The Amazing Spider-Man 2, the sequel to 2012’s The Amazing Spider-Man. We found out a little bit about the movie from the director, stars, and producers.

1. THE ENTIRE MOVIE WAS FILMED IN NEW YORK.

In fact, it was the largest film ever to shoot in New York State. The Amazing Spider-Man 2 filmed on location in the boroughs of New York City and in Rochester, as well as at studios in Brooklyn and Long Island. Some places to look out for: the Hearst Tower at 57th Street and 8th Avenue, which doubles as Oscorp; Lincoln Center; Union Square; Brooklyn Bridge Park; and Chinatown.

2. SPIDEY’S SUIT LOOKS DIFFERENT THIS TIME AROUND.

The suit in the first movie was designed to look like something a kid from Queens could actually make himself, using materials he could easily procure (the eyes, for example, were made with sunglasses). This time around, director Marc Webb wanted to stick a little closer to the suit in the comic books, making the blue darker and the eyes of the mask white and large. Costume designer Deborah L. Scott, who created Marty McFly's iconic look in Back to the Future and made the costumes for Titanic, brought this version of the suit to life.

3. PRODUCERS WANTED TO MAKE GWEN STACY A TRUE EQUAL TO PETER PARKER.

Producer Matt Tolmach said that in many of the previous Spider-Man movies, the focus has firmly been on Peter Parker’s journey—but it’s different in this film. “The truth is, [Gwen] is driving this story,” he said. “Peter is trying to keep it all together. That’s his struggle. Gwen has a real sense of who she is and what she wants. It’s not that it isn’t complicated but it’s incredibly empowering in a character. She’s making choices.”
Emma Stone, who plays Gwen, agrees. “I love how the relationship evolves in the second movie,” she said. “The clarity and maturity that Gwen has sort of achieved—I think because of the death of her father, honestly—has brought her life in sharp focus. So she’s really following her destiny. I think that’s one of the most inspiring parts of their relationship is that it is two incredibly equal parties.”
“When the comics were written—in the ‘50s and ‘60s—women didn’t really have much of a role in comics,” producer Avi Arad said. “They were supposed to look good and stay on the side and we are all very proud we were able to change [that] completely.” And a lot of that credit, he said, lies with Stone: “When you have a great actress and you give her the bulk of the material, now you have a real scene. You don’t just have someone screaming. When you have someone like that, you better make it a two person act all the time.”

4. THE CREW BUILT A REPLICA OF TIMES SQUARE.

For the scene where Spider-Man faces off against Electro for the first time, the crew shot for a couple of nights in Times Square, and then built a replica set at the studio in Long Island. "The logistical obligations of that scene were so complex, [that] we had to, and we could, amazingly," Webb says. "I remember that scene came up in the script and we worked on it a little bit and I was sorta denying myself the pain/fear of how it was actually going to be shot." The replica included the red TKTS stairs, recreations of storefronts, Father Duffy Square, and numerous Jumbotron screens (the rest of the area was added later using CG).
But even though building the replica set allowed the crew the control they needed, it still wasn’t easy. “It was a very difficult thing, just in terms of bringing the amount of lights that were required, the amount of cement that was required,” Webb says. “Our production director did a really extraordinary thing, and it was a huge spectacle. There were explosions and extras and all that stuff.”

5. BOTH JAMIE FOXX AND DANE DEHAAN HAD TO UNDERGO A PHYSICAL TRANSFORMATION TO PLAY VILLAINS.

Foxx wore 21 thin silicone facial prosthetics (which better mimic the quality of skin than foam prosthetics) to transform from Oscorp employee Max Dillon into Electro. The look was designed by The Walking Dead's Greg Nicotero of KNB EFX Group and finalized by special effects makeup artist Howard Berger. "It was like taking me and dipping me into blue candle wax for like four hours," he said. It was also his idea to give Max a combover. "My sister is my hair stylist and she created the 'Django' look; Ray Charles and things like that," he told Jay Leno. "When I'm the nerd guy, I want to be the first black man with a comb-over. I told her, 'Make me look like I would look if I never made it.'"
DeHaan, meanwhile, endured 3.5 hours in the makeup chair—donning contacts, teeth, and prosthetics—to play the Green Goblin. “Then there was another hour just to get into the suit,” he said. “I literally had four people using screwdrivers and wrenches getting me into that suit.” Performing in the suit was tough not just because it weighed 50 pounds, DeHaan said, but because of the temperatures on set. “[The] set was at least 110 degrees. They were literally pouring buckets of ice water down my suit in between takes,” he said. “It had evaporated by the time they called action—that’s how hot it was. I think I lost 7 pounds in like two days. Which for me is a high percentage of my body weight.”

6. TO NAIL ELECTRO’S LOOK, THE VFX TEAM STUDIED ELECTRICAL PHENOMENA.

After deciding on the right look for the makeup, Foxx said, "[The VFX crew] took it from there. Those guys are geniuses at what they do. [Jerome Chen, Sony Pictures Imageworks Visual Effects Supervisor] was like, 'We got it, we know what we want to do. We want to make a thunderstorm inside your body.' It's great to see it all work." VFX artists made it look as though the electricity was inside of Electro, not just running along the surface of his skin, and watched footage of nighttime thunderstorms and bioluminescent animals and photos of nebulae to achieve the look.
Foxx was thrilled with the way the CGI and practical makeup worked together. “The CGI guys would come out and be there and look at me and take pictures and say ‘stand this way, say this, laugh,’” he explained. “It was really fun. It was like you were back at your crib where you’re looking in the mirror practicing on how to act. When I looked and saw what they did with the CGI, I was like that is incredible because people don’t even know that that is actually me. They think it’s all CGI.”

7. IN ONE SCENE, GARFIELD’S FOOT GOT BRUSHED BY A CAB.

Andrew Garfield, who plays Peter/Spidey, has a favorite scene—which Webb and Stone also love—in which Peter and Gwen see each other for the first time in a year. Garfield had the idea that Peter should see her and cross the street, oblivious to all traffic. “[He] talked about cartoons—when the skunk gets a smell and he floats across,” Webb says. “It was that kind of idea.”
Peter Parker might have been oblivious to the traffic, but Garfield didn’t make it through unscathed. “In the take [that was] used, the taxi actually ran over my heel,” he says. “You can see a little facial recognition of that just as I’m about to step onto the pavement. Literally, a tire smacked my heel. It was really scary.”

8. PAUL GIAMATTI WANTED TO PLAY THE RHINO.

The actor appeared on Conan O’Brien’s show in 2011 and said that if he could play one character in a Spider-Man movie, it would be The Rhino. “Rhino came to us for the role!” Arad said. The Rhino’s mechanized suit is entirely CG, but he wore a rig on set.
“[Paul] was so great to have on the set,” Tolmach said. “He just showed up and [was] about fun. This movie felt like we were now free in some ways to have fun, and to tell a bigger story—and a more tragic story. We were freed up of the obligations of origin. [We could] build a movie that we all really believed in and tell this big superhero opera. And that’s what you’re going to see more of going forward—the expansion of the universe with all these characters.”

9. CERTAIN SCENES WERE INSPIRED BY SILENT-ERA STARS.

Webb, Garfield, and stunt coordinator Andy Armstrong are all big fans of silent film stars like Buster Keaton and Charlie Chaplin, who performed physical comedy on-camera. This time around, they wanted some of that physicality to inform how Spider-Man (and Peter Parker) moved. “Sometimes, Spider-Man is witty and sometimes not—he’s trying his standup routine out on the criminals before he takes it to the comedy floor,” Garfield says. “The physical ability he has—we don’t want to just be punching and kicking and being cool. There’s something sort of trickster element that we wanted to capture.” They hired Cal McChrystal, the physical comedy director on One Man, Two Guvnors, to help come up with a few moments.
“It was dipping into a different kind of filmmaking and acting,” Webb says. “If you sit down and watch a Charlie Chaplin movie and don’t listen to the music—or if you play different music over it, like a Pixar soundtrack—it becomes accessible in a way that is profound. It becomes emotional and beautiful and there’s something really powerful there. That was an attempt to bring back vaudeville for a second, which is a lost art. It was one of those things where people watch and it goes by and it's as it should be. But it took a long time to do.”
Armstrong watched a particular scene from one of Keaton's shorts in which the actor grabs the back of a moving car and is whisked out of the scene almost horizontally; once he figured out how Keaton did it, they emulated it for a shot in ASM2.

10. THE PRODUCTION BUILT RIGS TO DO STUNTS PRACTICALLY.

A fight in a plane that kicks off the movie was accomplished mostly using the actors rather than stunt people. The crew built the interior of a G-5 plane and combined it with a motion base and two rings that could rotate the plane 360 degrees. They also used the rig in a later scene, inspired by Fred Astaire's work in Royal Wedding (in which the actor danced on the walls and ceiling), where Garfield rolls up onto the wall and walks along the ceiling, removing the Spidey suit. "All that stuff, people get a certain kind of pleasure from that," Webb said. "It's different from comedy, it's different from action. It's like watching people dance in a way. It's physical virtuosity that people enjoy in a different kind of way."
Garfield prefers to do his own stunts, but it's not always possible. "I used to be a gymnast and an athlete and it’s important to me—just like with every other aspect with the character—[that] I have some enjoyment of it," he said. "I don’t want to let it pass me by and watch somebody else play Spider-Man. I want to do it because it’s my only chance to really play it in a way that’s not just crawling up the doorway at my mum’s house. So I felt really stoked to get a chance. There’s me and there’s two stunt guys. It’s usually better man wins in terms of whatever stunt we’re doing. Sometimes it’s just the insurance risk is too high if I do them. If I die, the movie has to stop."

11. THE MOVIE WAS SCORED BY HANS ZIMMER … AND A FEW FRIENDS.

It was Webb's idea for the Oscar-winning composer to form the supergroup that would create the music for The Amazing Spider-Man 2. The band called itself The Magnificent Six and featured Pharrell Williams, Johnny Marr (The Smiths), Michael Einziger (Incubus), Junkie XL, Andrew Kawczynski, and Steve Mazzaro. The idea, Zimmer told Billboard, is that "Peter Parker is a kid, he's just graduating. If he had to listen to music and that was the way he expressed emotion, it wouldn't be big Wagnerian horns and Mahler strings. It would be rock 'n' roll."

courtesy of mentalfloss

Sunday, 14 February 2016

The Evolution of Technology



Every baby born today in the Western world has a life expectancy of about 100 years, which means it will still be alive in 2110. It's nearly impossible to forecast life in 2110 in detail. What we can venture to guess, based on current trends, is that humans will still populate the planet, as will animals, and we will be joined by simple biological creatures designed synthetically in the lab and, of course, by machines. Machines will roam the earth, toiling in factories, taking our children to school, delivering babies, cleaning the streets, and performing other such tasks, which will make them seemingly indispensable to us.
We don't know how sophisticated these machines will be a century from today. Some might continue as dumb machines like the ones we have now, assiduously screwing the caps onto Coke bottles. Or they might be humanoid robots that resemble us and nurse our elderly parents. The increasing sophistication of Technology, from the steam engine and the discovery of electricity to telecommunications, the Internet and biotechnology, can be seen as a haphazard confluence of the breakthroughs of geniuses, or it can be seen as an evolutionary pattern.
Brian Arthur of the Santa Fe Institute believes that Technology evolves over time: "machines started as disparate pieces of seemingly unconnected technologies, but like humans, they also have an origin and a process of evolution." He is arguably the first person to tackle the question of the origin and evolution of machines, eloquently laid out in his book, The Nature of Technology. Evolution here means an increase in maturation and complexity, and it does not necessarily have to follow the path of Darwinian evolution, which is modification by descent, where nature introduces small variations in an existing form over a long period of time. Granted, the results are staggering, but the journey, such as that of the ape's evolution into mankind, can take millions of years.
Technology, according to Arthur, spawns new generations of products by combining existing components, a phenomenon he calls combinatorial evolution. The change in 'species' can thus be quite radical in a short period of time. The greater the number of components we have at our disposal, the larger the number of permutations of new technologies that can be created, and the faster the evolution. The technology eco-system becomes alive with increasing possibility with the passage of time.
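A toy calculation makes that combinatorial claim concrete. The component counts below are arbitrary, chosen only to show the growth: the number of ways to combine even small groups of components grows far faster than the component pool itself.

    # Toy illustration of combinatorial evolution: how many new "technologies"
    # could, in principle, be assembled from k components drawn from a pool of n?
    from math import comb

    for n in (10, 100, 1000):
        pairs = comb(n, 2)    # two-component combinations
        triples = comb(n, 3)  # three-component combinations
        print(f"{n:>5} components -> {pairs:>9} pairs, {triples:>12} triples")

    # 10 components give 45 possible pairs; 1000 components give 499,500 pairs and
    # over 166 million triples, so each new component multiplies the space of recombinations.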
“Slowly, at a pace measured in decades, we are shifting from technologies that produced fixed physical outputs to technologies whose main character is that they can be combined and configured endlessly for fresh purposes. Technology, once a means of production, is becoming a chemistry.” (Brian Arthur)
There is yet another aspect to Technology's evolution: technologies always capture a phenomenon (like using wind for power), and new phenomena become available for capture as tools grow more powerful. Take the simple fact that when you bend a flexible material, it stores energy. This phenomenon was used to create ancient tools like the bow and arrow. Today, we use all kinds of phenomena (optical, chemical, physical, and electrical, to name a few) to create new technologies.
The rules of Technological evolution thus make a strong argument for accelerating evolution. Compared to the snail-paced evolution of the human species, we have to wonder if we’ll be able to manage the increasing complexity of technology or if the dystopian vision of some futurists will come true: machines will become ‘alive’ with artificial intelligence and not just roam the earth but also rule it.
Ayesha and Parag Khanna explore human-technology co-evolution and its implications for society, business and politics at The Hybrid Reality Institute.

Friday, 12 February 2016

Bob business promoter Effects



Check these links out:

http://www.show.fully.pk/watch/qxnkfLv2nME

https://www.youtube.com/watch?v=qxnkfLv2nME

https://www.youtube.com/watch?v=jvzGx0WvEHg

https://www.youtube.com/watch?v=AJUWUZKn-_Q