Aruna Quadri produced a comeback for the ages to defeat long-term rival and defending champion Omar Assar 4-3 in the semifinal of the 2020 ITTF Africa Top 16 Cup on Tuesday, 25 February in Tunis.

Quadri's victory over the Egyptian means that for the first time in two editions, the defending champion failed to progress to the final of the men's singles. The tie between the two best players in Africa lived up to its billing, with a healthy dose of suspense, incredible shot-making and extended rallies that left fans at the Rades Multi-Purpose Hall breathless.

Assar entered the match boasting a 9-6 head-to-head advantage over the Nigerian and looked likely to extend it when he won the first two games 11-6, 11-5 to open up a 2-0 lead. Quadri, Africa's top-ranked player, however, found his rhythm and worked his way back into the match, taking the next two games 11-4, 14-12 to restore parity. There was little to separate the two players, and spectators enraptured by the nail-biting, tension-soaked encounter watched on as Assar went 3-2 up by winning the fifth game 11-7.
The Egyptian was immediately pegged back by his Nigerian nemesis, who produced some magical shots to level the tie by taking the sixth game 11-5.

A gripping deciding game ensued and saw the Egyptian come within a point of vanquishing his foe at 10-8, but Quadri showed incredible mettle to win the next three points and turn the tie in his favour. With the finish line in sight, Quadri called a time-out while leading 11-10, presumably to gather himself and still his racing heart, and it seemed to do the trick: an unforced error from Assar handed Quadri the win and a place in the final, leaving Assar to rue his missed chances.

Quadri will battle another Egyptian in the final after veteran Ahmed Saleh beat Senegal's Ibrahima Diaw 4-1 (14-12, 11-7, 3-11, 11-9, 11-7) in the other semifinal.

Nigeria could make it a clean sweep of the men's and women's singles, with Offiong Edem also reaching the final after she beat Egypt's Yousra Helmy 4-2 (6-11, 14-16, 11-8, 11-7, 11-7, 11-6) in the semifinal. She will contend with Helmy's compatriot and defending champion Dina Meshref, who beat Sarah Hanffou 4-2 (11-6, 11-1, 11-13, 9-11, 11-7, 11-4) in a repeat of last year's final.

The winners of the tournament will represent Africa at the 2020 ITTF World Cups.
A round-up of the latest transfer speculation involving the area's Championship clubs…

Moussa Dembele's proposed move from Fulham to Tottenham is off, according to Sky Sports News HQ. The two clubs recently agreed a fee in the region of £5m and Dembele was due to undergo a medical. Sky say the Whites insisted that Dembele return to Craven Cottage on loan for the rest of the season, but Spurs refused. Dembele's contract expires at the end of the season.

QPR do not want to sell Matt Phillips to West Bromwich Albion but could be forced to because he has refused to discuss a new contract, the Daily Star Sunday claim. West London Sport revealed during the summer that Rangers had rejected a bid from Albion for the winger, and more recently revealed that the Baggies were still interested in a potential deal. Bournemouth and Watford have also shown interest in Phillips.

Rangers are keen to reach an agreement to sell Phillips and offload a number of other players in order to balance the books. However, the Star claim QPR do not want to do business but could be forced to, with West Brom expecting to sign Phillips for £8m.

Meanwhile, a transfer-gossip website has claimed that QPR have made a move for Oxford United midfielder Kemar Roofe. And the Mirror say young R's keeper Joe Lumley is wanted by Swindon, Colchester and Oldham on loan.
In the middle of what many experts consider one of the wildest election seasons in American history, a handful of Southwestern Community College students have an opportunity to become a key part of the process. Eight members of Dr. Bucky Dann's Social Problems class are researching local and statewide issues while preparing to ask candidates questions in a series of upcoming debates on SCC's Jackson Campus.

The first of these will be at 7 p.m. on Thursday, Sept. 29, and will feature candidates for the Jackson County Board of Commissioners: Democratic incumbents Vicki Greene and Mark Jones, as well as Republican challengers Ron Mau and Mickey Luker.

"I've been into some issues in the past, but now I'm able to learn more and get more involved," said Alma Russ, a student from Whittier. "Local politics are where you can make a real difference. Being part of these debates is a little intimidating, but it's also quite exciting."

Other debates at SCC this fall will feature the candidates in the N.C. Senate race between Sen. Jim Davis (R) and challenger Jane Hipps (D) on Tuesday, Oct. 11, as well as the N.C. House race between Rep. Joe Sam Queen (D) and Mike Clampitt (R) on Tuesday, Oct. 25. The same N.C. House and Senate candidates participated in SCC's inaugural debates two years ago. All debates will take place at 7 p.m. in the Burrell Building Conference Center.

"It's crazy to think we get to be a part of it this year by having debates here," said Matthew Travers, a student who currently resides in Sylva. "All of the questions will be asked by my classmates and myself. It's cool knowing that we'll have a role, however small it may be, in the election."

In preparation for the debates, The Sylva Herald editor Quintin Ellison visited the class and discussed several of the key issues in local and statewide races. Dr. Dann said class sessions have been devoted to helping students set aside their own opinions so they can see all points of view and remove any bias from the questions that will be asked.

"It's one thing to study political issues on paper and to talk about them in class," Dr. Dann said. "When you're able to actually ask people running for office where they stand on significant issues of the day, that takes the learning process to a whole new level. We are grateful all these candidates have agreed to participate, and we know these debates will be meaningful in helping voters make informed decisions when they go to the polls."
Few companies have captured the world's attention online in recent years as much as Twitter has. Rapid, structured, public communication between groups of people is not only a personal paradigm changer for many who have seriously explored the service – it's also an incredible opportunity to analyze a rich and dynamic set of data about interpersonal conversation. First the Web, then email, then instant messaging and SMS all helped speed up the world we live in. Twitter made that rapid communication public and easier than ever for machines to mine for connections.

Just as Facebook will never be Twitter because of the limited access it offers outsiders to its social data, Twitter has limitations of its own. A service called Status.net will launch in May that could overcome some of those limitations and make a significant impact on the world we work in. Laconica, the Canadian company offering the most popular Open Source alternative to Twitter, announced plans today to begin selling subscriptions to hosted microblogging installations for businesses. The default address of these new sites will be yourname.status.net. We suspect that this could be a very big deal. (We found out about it from coverage on Microblink on Techmeme.)

Step One: People Will Want It

Laconica already allows anyone to install its software on their own servers, for free (see Leo Laporte's Twit Army, for example), but the easy paid offering from Status.net could catch on much faster. The service provider will be responsible for maintenance, upgrades will come automatically, the URL is clear and dignified, and the fact that the software is open source could enable a plug-in and extension community to grow around the architecture as soon as it gets large enough for that to be viable. Companies will pay to have either public or private microblogging installations hosted and branded for them.
They will do so because if they do not, their employees will have no group of allied professionals to securely cry out to for help with work problems. Their departments will remain out of touch and unfamiliar with the people and work around their own company. Companies without a microblogging system will seem as silly and disadvantaged in the future as companies do today that say "we don't need instant messaging, we have email," or "we don't need email, we have a fax machine."

Step Two: People Will Build on It

Some companies will use the hosted Status.net platform, others will decide to put Laconica on their own servers, and others still will use some other provider's business-oriented but developer-friendly microblogging service. Once that fundamentally structured layer of social conversation has spread throughout a substantial portion of the business world, hopefully as interoperable Open Source software, here's what will happen.

We discussed one of the most potent applications of analyzing Twitter social connection data in a recent post titled The Inner Circles of 10 Geek Heroes on Twitter. These are the kinds of bird's-eye views through data parsing that an Open Source microblogging platform for businesses will enable. All of the following is based on nothing more than cross-referencing user profiles, friend connections and public replies between users. Any parts of this vision that aren't simple today will be simpler for someone to build once there's adoption and Open Source code.

In private networks, a company will be able to receive automatic notification when one of its employees has begun conversing with another particular employee more than they had before. Perhaps they'll consider putting them in the same work group. If one salesperson doesn't converse with the technical team as often as other salespeople do, a company might wonder whether that salesperson is less comfortable explaining technical matters to customers.
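A minimal sketch of how such an automatic notification could work, assuming public replies are available as simple sender/addressee records (the field names and the doubling threshold here are invented for illustration, not part of any real microblogging API):

```python
from collections import Counter

def reply_pairs(messages):
    """Count directed (sender -> addressee) public replies."""
    return Counter((m["from"], m["to"]) for m in messages if m.get("to"))

def conversation_shifts(last_period, this_period, factor=2.0):
    """Return the pairs whose reply volume grew by `factor` or more
    between two time windows -- candidates for a notification."""
    before = reply_pairs(last_period)
    now = reply_pairs(this_period)
    return [pair for pair, n in now.items()
            if n >= factor * max(before.get(pair, 0), 1)]
```

Nothing here is more exotic than counting who replies to whom in two time windows and comparing the counts; that is exactly the kind of analysis an open data layer makes trivial.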
It will be trivial to determine which technical staff are friendliest and most appropriate to introduce a salesperson to, because those kinds of connections will be fully graphable. In public business networks, community managers will be able to identify the customers most engaged in conversation with diverse groups of other customers at the snap of a finger. Those are the kinds of community members that companies hire. Companies will be able to see whether groups of people with similar traits in their profiles ask for customer service more often than other groups, and when they seek to engage with those communities to improve product usability, the contours of each community will be easier than ever to understand.

People say the phrase "social graph" is too vague, but when it comes to structured, open microblogging, social connections through conversation and content are literally graphable. Here are the users, here are their friends, here are their public messages and here are their replies to one another – just draw a line from one column to one row and a narrative is formed by the data. Repeat that process and you'll be able to build stories around trends.

Is this creepy? It doesn't have to be. There's a whole lot of exciting potential here, and if an increasingly open technology world can help the business world understand the value of openness over control, then this kind of analysis could be democratized and used for good. Let's look at this from the perspective of Twitter right now. When I'm away from my computer and think of a question I need answered, I can send that question out to my Twitter network by SMS. Three people might post a public reply answering my question. When I get back to Twitter, I see those three replies and I publicly thank one of those people in particular for providing such a good answer. Now repeat. Again and again, throughout an organization, across multiple organizations.
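The column-to-row linking described above really is this simple. A sketch, again over hypothetical sender/addressee records, that builds the reply graph and ranks users by how many distinct people they converse with (the "most engaged with diverse groups" measure):

```python
from collections import defaultdict

def reply_graph(messages):
    """Map each user to the distinct users they have publicly replied to."""
    graph = defaultdict(set)
    for m in messages:
        if m.get("to"):
            graph[m["from"]].add(m["to"])
    return graph

def most_connected(messages, top=3):
    """Rank users by the number of distinct conversation partners."""
    graph = reply_graph(messages)
    return sorted(graph, key=lambda u: len(graph[u]), reverse=True)[:top]
```

Swap the ranking key and the same graph answers the other questions above, such as which technical staffer is best connected to the sales team.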
Knowledge-sharing paths get worn in the virtual grass of the public field of microblogging. Smart companies want their people creating those paths, and only a fool would neglect an opportunity to illuminate these connections in the eyes of management.

It won't happen on Twitter alone, though. Twitter is too public, the company is too bound by its own limitations on how much data it really wants anyone else to pull out of the river of tweets, and relatively small groups are a very important part of the future of microblogging.

We expect that hosted or free company-specific microblogging installations will become huge sources of business intelligence data, and we hope that happens through interoperable, Open Source software. We're excited to see what Laconica can do with Status.net.

– Marshall Kirkpatrick
How would you measure your comfort, user experience, smoothness and happiness while producing music?

Intel® Optane™ SSDs open a full horizon of new application usages and use cases. But how would you translate device-level performance into an application performance improvement? And how would that be translated into user experience improvements – the ultimate goal of any technology progress? That's a question I ask myself while evaluating new technologies. In most conditions performance can be measured by benchmarks, and simply comparing scores or runtimes shows the advantage of one technology over another. In other cases it is intangible: how would you measure the smoothness of your experience, or how would you score your feelings? That's more difficult, as everyone has a different perspective. In this blog I'll attempt a more formal assessment of those feelings, based on a recent story. If you haven't had a chance to see Intel's interview with top electronic music and film composer BT, find a moment now. It's worth it!

BT is one of the most innovative musicians, utilizing the newest technologies in his music production and creating his own. His film scoring work is impressive (The Fast and the Furious, Solace, Stealth) and uses the latest advances in massively sampled orchestration available in real time. While sampling has existed for years, the way he pushes it to the limits with a hybrid orchestra approach and granular synthesis is quite remarkable.

As a user of Intel® SSD 750 Series drives, he was excited by NVMe SSDs and the performance advantages the PCIe interface brings. Combining multiple SSDs in a RAID volume allows him to improve overall bandwidth and, of course, expand capacity. That's a great deal, and RAID capability is built into all operating systems today. However, RAID can't improve access latency.
No matter how many drives you combine, the access latency of the array reflects its worst drive: it is always equal to or higher than the latency of a standalone SSD. There is a class of applications that can't keep scaling performance through SSD bandwidth improvements alone, and this story demonstrates that. Device latency is a key requirement for improving audio sample playback performance.

A complete orchestra is sampled into terabytes of sample data, with playback of up to 3,000 tracks at a time. Available DRAM can hold only small initial pieces of those sounds (the attacks), while the body of each sound is streamed directly from storage. For real-time playback, it is critical that all data processing completes within one audio buffer period – say 5 ms, a common latency these days. Otherwise the user will experience audio drops and other artifacts, including fatal interruptions. This is a case where scaling storage bandwidth alone can't solve the problem.

Let's look at the facts. A single sample is a contiguous piece of data. Assume each sample runs at 48 kHz, 32-bit stereo, which translates into roughly 0.37 MB/s of bandwidth. You would expect that with a PCIe SSD that reads sequentially at, say, 2.5 GB/s, you could play roughly 7,000 samples at a time (2.5 GB/s ÷ 0.37 MB/s ≈ 6,900). Why would you ever need faster storage if this number exceeds any real use case? Well, the conclusion is wrong. Sample libraries play thousands of samples at a time, and layering, microphone positions and round-robin sample rotation multiply that by an order of magnitude. Also, streaming many sequential fragments at once naturally randomizes the I/O. The workload is randomized down to a small unit: the application request size, or in the common case the file system sector size.
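The arithmetic above can be checked in a few lines (a back-of-envelope sketch; the 2.5 GB/s figure is the example sequential-read rate used in the text, not a measured number):

```python
# Per-voice streaming bandwidth: 48 kHz frames, 32-bit (4-byte) samples, stereo.
SAMPLE_RATE = 48_000          # frames per second
BYTES_PER_FRAME = 4 * 2       # 32-bit stereo

stream_bw = SAMPLE_RATE * BYTES_PER_FRAME   # bytes/s per sample stream (~0.37 MB/s)
ssd_bw = 2.5 * 1024 ** 3                    # 2.5 GB/s sequential read, in bytes/s

# Theoretical ceiling on concurrent streams if the workload stayed sequential.
concurrent_streams = ssd_bw / stream_bw
print(stream_bw, concurrent_streams)        # ~384,000 B/s and ~7,000 streams
```

The point of the article is that this ceiling is never reached in practice, because the mixed streams turn into random small-block I/O.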
With that, the storage workload is no longer sequential and must be measured as IOPS at a small block size. From the device's perspective this is a fully random I/O condition, distributed across the full span of the sample library with no hot area.

Here we come to the point where NAND-based SSD performance varies significantly with workload parameters. It is easier for a drive to run a single-threaded sequential workload than a random one, or even than many parallel sequential workloads. Of course, the difference is not as dramatic as with hard drives, where a head must physically move, with significant latency impact and severe performance degradation, but the impact is still meaningful. The root cause is the NAND architecture, which consists of sectors (the minimal read size), pages (a number of sectors; the minimal write size) and erase blocks (a number of pages; the minimal erase size). Combined with NAND-specific SSD acceleration that aggregates sequential I/O into bigger transfer sizes, this yields performance improvements for sequential I/O that are not available for random small-block I/O.

3D XPoint™ memory solves that problem. The cell is cache-line addressable by architecture, requires no erase cycle before a write, and has significantly lower access time than NAND. Implemented as a block device, Intel® Optane™ SSDs are optimized for low latency and high IOPS, especially at low queue depth. This directly correlates with exceptional quality of service, which reflects maximum latency and latency distribution. As a consequence, an Optane SSD delivers similar performance no matter the workload – random vs. sequential, read vs. write.

Let's run some tests to visualize that. I'll be running this experiment on Microsoft Windows 10.
You might expect Linux or OS X charts to look similar or better, but since we're evaluating an environment similar to the one installed in BT's studio, I'll try to match it here.

Configuration: Asus X299-Deluxe, Core i7-7820X, 32 GB DRAM, Intel SSD 750 Series 1.2 TB, Intel Optane SSD 900P. You may download all FIO configuration scripts from my repository: www.github.com/intel/fiovisualizer

The NAND-based SSD is brought to its sustained performance state before every run; the Optane SSD doesn't have this side effect and delivers full performance right away. As you can see in the charts, I'm only considering the I/O randomization scenario and the overall delta in absolute SSD performance under different conditions. I'm leaving aside other workloads, which are evaluated thoroughly by third parties such as Storage Review, AnandTech, PC Perspective and others. All of the simulated workloads are stressful for an SSD, in the sense of pushing enough I/Os to reach the maximum performance of the device. The Intel Optane SSD leads not only in absolute numbers but also in performance variability between workloads.

In a real application scenario, such as the story above, that means stable and predictable performance for sample playback that doesn't change its characteristics based on the number of samples, their sizes, the way they are played, or other concurrent activities such as multitrack recording. You may call it a "performance budget" you can split between workloads without sacrificing overall performance. For a musician, that means Optane delivers a smooth experience without audio drops, even at peak demand. It also means no need for offline rendering, channel freezing and sub-mixdowns – which equals more time for being creative and unique.
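For readers who want to reproduce the general shape of this experiment rather than the exact scripts, a fio job approximating the randomized small-block read pattern at low queue depth might look like the following. This is an illustrative sketch, not one of the scripts from the repository above; the file name, size and runtime are placeholders.

```ini
; Hypothetical fio job: 4 KiB random reads at queue depth 1,
; approximating randomized sample streaming on Windows.
[global]
ioengine=windowsaio
direct=1
bs=4k
rw=randread
iodepth=1
time_based
runtime=60

[sample-playback]
filename=fio-testfile
size=10G
```

Comparing the IOPS and latency percentiles this reports for a NAND SSD against an Optane SSD should reproduce the variability gap discussed above.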
Atletico Madrid coach Simeone: We need Lucas here

By Paul Vegas

Atletico Madrid coach Diego Simeone insists he's counting on Lucas Hernandez for the remainder of this season. The defender is a confirmed target for Bayern Munich, while he is also being linked with Manchester United and Manchester City.

But ahead of today's win over Levante, Simeone said: "He's part of the group and we'll decide the morning of the match. He is very good, he is a strong guy and I hope he will come back as he left when he had to go out against Alavés. He is a very important player for us, he gives us depth on the left side when he plays this role and we need him."
Another day, another college football top 25 released. This time it's by the knowledgeable team over at Pro Football Focus, one of the most respected analytical football sites on the web. It won't surprise you to hear that they're going with Alabama, the reigning national champion, as the No. 1 team heading into 2016. Alabama is followed by another College Football Playoff participant from last year – Oklahoma. LSU, Clemson and Michigan round out the top five. Stanford is the highest-ranked Pac-12 team, at No. 7. Here's the entire top 10. You can check out all 25 ranked programs over at PFF.

1. Alabama
2. Oklahoma
3. LSU
4. Clemson
5. Michigan
6. Florida State
7. Stanford
8. Ohio State
9. Tennessee
10. Ole Miss

All in all, there are six SEC teams, six Pac-12 teams, three Big Ten teams, three ACC teams, four Big 12 teams, one independent and two non-power-five teams. College football fans – do they have it right?

[Pro Football Focus]
TAYLOR, B.C. – The District of Taylor held a ribbon-cutting ceremony this afternoon for a new $1.2 million wastewater lift station facility.

Lift Station 3 was funded by contributions from both the federal and provincial governments, as well as from the District itself. Taylor contributed just over $400,000 towards the facility's cost, the Province pitched in $386,100, and the federal contribution was $585,000.

The lift station has a pumping capacity of around 22 litres per minute. The station, which has been operational since February, pumps an average of 320 cubic metres of wastewater per day. The project is the third of four phases of the District's sewer upgrades.

"We appreciate the joint effort between the Province and the Government of Canada to help rebuild our critical infrastructure," said Taylor Mayor Rob Fraser. "To receive approximately a million dollars for our rural community is a tremendous support for our sewer infrastructure. This funding will help the future needs of our community, and I want to thank the Province and the federal government for this significant contribution."

"The Government of Canada is very proud to have contributed to the construction of this new lift station for the District of Taylor," said Infrastructure and Communities Minister Amarjeet Sohi. "We are committed to investing in infrastructure that helps build cleaner, stronger and more sustainable communities. This project is an impressive example of a system that will better serve residents while supporting future growth and protecting the environment for years to come."

"The safe movement and treatment of wastewater is not something people always want to talk about, but it is essential for the health and wellbeing of communities," said Municipal Affairs and Housing Minister Selina Robinson. "Thanks to this partnership between all levels of government, people in Taylor have a new, efficient lift station that will improve the way they treat wastewater in their community for years to come."

Taylor's new lift station. Photo by Chris Newton
OTTAWA, Ont. – A senior federal source says the Liberals are considering hiring a former top judge to guide a renewed consultation with Indigenous communities on the Trans Mountain pipeline expansion.

The Federal Court of Appeal last month quashed the approval given to the project, saying the consultation with Indigenous communities wasn't good enough and criticizing the lack of attention paid to the environmental impact of increased tanker traffic off the coast of British Columbia. The Liberals are still considering whether to appeal the decision, but are looking at how they can do what the court said was lacking in order to get the pipeline work back underway.

An official close to the plan says that includes possibly hiring a former senior judge, perhaps a retired Supreme Court of Canada justice, to advise on what would constitute meaningful consultation with Indigenous communities to satisfy the conditions of the court. The Liberals intend to announce the next steps before the end of September. The government wants to have the pipeline's fate decided within the next six to eight months, before the next federal election and also before Alberta's provincial election in May.